Mar 18 15:35:55 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 15:35:55 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:55 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:35:56 crc restorecon[4692]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56
crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 
15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:35:56 crc restorecon[4692]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc 
restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:35:56 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 15:35:57 crc kubenswrapper[4696]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 15:35:57 crc kubenswrapper[4696]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 15:35:57 crc kubenswrapper[4696]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 15:35:57 crc kubenswrapper[4696]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 15:35:57 crc kubenswrapper[4696]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 15:35:57 crc kubenswrapper[4696]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.389882 4696 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398402 4696 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398449 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398459 4696 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398468 4696 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398476 4696 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398486 4696 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398494 4696 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398502 4696 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398510 4696 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 
15:35:57.398541 4696 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398550 4696 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398558 4696 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398569 4696 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398581 4696 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398591 4696 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398601 4696 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398610 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398618 4696 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398627 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398635 4696 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398643 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398652 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398660 4696 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398668 4696 feature_gate.go:330] unrecognized feature 
gate: SigstoreImageVerification Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398676 4696 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398684 4696 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398697 4696 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398708 4696 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398717 4696 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398728 4696 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398749 4696 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398759 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398767 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398776 4696 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398784 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398792 4696 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398799 4696 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398810 4696 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398818 4696 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398826 4696 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398833 4696 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398841 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398849 4696 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398857 4696 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398865 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398872 4696 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398880 4696 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398888 4696 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398896 4696 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398904 4696 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398912 4696 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398920 4696 feature_gate.go:330] 
unrecognized feature gate: HardwareSpeed Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398930 4696 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398939 4696 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398947 4696 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398955 4696 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398962 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398971 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398979 4696 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398987 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.398995 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399002 4696 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399009 4696 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399017 4696 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399025 4696 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399032 4696 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:35:57 crc 
kubenswrapper[4696]: W0318 15:35:57.399040 4696 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399047 4696 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399055 4696 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399062 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.399070 4696 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399271 4696 flags.go:64] FLAG: --address="0.0.0.0" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399293 4696 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399316 4696 flags.go:64] FLAG: --anonymous-auth="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399327 4696 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399339 4696 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399348 4696 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399360 4696 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399371 4696 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399380 4696 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399390 4696 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399399 4696 
flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399412 4696 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399421 4696 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399431 4696 flags.go:64] FLAG: --cgroup-root="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399440 4696 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399448 4696 flags.go:64] FLAG: --client-ca-file="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399457 4696 flags.go:64] FLAG: --cloud-config="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399466 4696 flags.go:64] FLAG: --cloud-provider="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399477 4696 flags.go:64] FLAG: --cluster-dns="[]" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399488 4696 flags.go:64] FLAG: --cluster-domain="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399497 4696 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399506 4696 flags.go:64] FLAG: --config-dir="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399515 4696 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399552 4696 flags.go:64] FLAG: --container-log-max-files="5" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399563 4696 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399572 4696 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399582 4696 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 
15:35:57.399592 4696 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399602 4696 flags.go:64] FLAG: --contention-profiling="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399611 4696 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399620 4696 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399630 4696 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399639 4696 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399650 4696 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399659 4696 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399668 4696 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399676 4696 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399686 4696 flags.go:64] FLAG: --enable-server="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399696 4696 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399708 4696 flags.go:64] FLAG: --event-burst="100" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399718 4696 flags.go:64] FLAG: --event-qps="50" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399727 4696 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399736 4696 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399745 4696 flags.go:64] FLAG: --eviction-hard="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 
15:35:57.399756 4696 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399765 4696 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399774 4696 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399783 4696 flags.go:64] FLAG: --eviction-soft="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399792 4696 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399802 4696 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399817 4696 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399826 4696 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399835 4696 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399844 4696 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399853 4696 flags.go:64] FLAG: --feature-gates="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399863 4696 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399872 4696 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399883 4696 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399913 4696 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399923 4696 flags.go:64] FLAG: --healthz-port="10248" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399932 4696 flags.go:64] FLAG: --help="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 
15:35:57.399942 4696 flags.go:64] FLAG: --hostname-override="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399950 4696 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399959 4696 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399968 4696 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399977 4696 flags.go:64] FLAG: --image-credential-provider-config="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399986 4696 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.399995 4696 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400003 4696 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400012 4696 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400021 4696 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400030 4696 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400040 4696 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400049 4696 flags.go:64] FLAG: --kube-reserved="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400058 4696 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400067 4696 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400076 4696 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400085 4696 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 15:35:57 crc 
kubenswrapper[4696]: I0318 15:35:57.400093 4696 flags.go:64] FLAG: --lock-file="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400102 4696 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400111 4696 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400120 4696 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400145 4696 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400157 4696 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400165 4696 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400175 4696 flags.go:64] FLAG: --logging-format="text" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400184 4696 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400193 4696 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400202 4696 flags.go:64] FLAG: --manifest-url="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400211 4696 flags.go:64] FLAG: --manifest-url-header="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400222 4696 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400232 4696 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400243 4696 flags.go:64] FLAG: --max-pods="110" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400252 4696 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400261 4696 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 15:35:57 crc 
kubenswrapper[4696]: I0318 15:35:57.400270 4696 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400279 4696 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400289 4696 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400298 4696 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400308 4696 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400327 4696 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400337 4696 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400346 4696 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400355 4696 flags.go:64] FLAG: --pod-cidr="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400364 4696 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400376 4696 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400385 4696 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400394 4696 flags.go:64] FLAG: --pods-per-core="0" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400403 4696 flags.go:64] FLAG: --port="10250" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400413 4696 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400422 4696 flags.go:64] FLAG: 
--provider-id="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400431 4696 flags.go:64] FLAG: --qos-reserved="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400440 4696 flags.go:64] FLAG: --read-only-port="10255" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400448 4696 flags.go:64] FLAG: --register-node="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400457 4696 flags.go:64] FLAG: --register-schedulable="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400468 4696 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400482 4696 flags.go:64] FLAG: --registry-burst="10" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400491 4696 flags.go:64] FLAG: --registry-qps="5" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400499 4696 flags.go:64] FLAG: --reserved-cpus="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400509 4696 flags.go:64] FLAG: --reserved-memory="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400547 4696 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400556 4696 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400566 4696 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400575 4696 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400583 4696 flags.go:64] FLAG: --runonce="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400592 4696 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400601 4696 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400611 4696 flags.go:64] FLAG: --seccomp-default="false" Mar 
18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400619 4696 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400628 4696 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400637 4696 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400647 4696 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400656 4696 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400665 4696 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400673 4696 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400682 4696 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400691 4696 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400700 4696 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400710 4696 flags.go:64] FLAG: --system-cgroups="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400718 4696 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400732 4696 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400741 4696 flags.go:64] FLAG: --tls-cert-file="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400750 4696 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400761 4696 flags.go:64] FLAG: --tls-min-version="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400770 4696 flags.go:64] FLAG: 
--tls-private-key-file="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400778 4696 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400787 4696 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400797 4696 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400807 4696 flags.go:64] FLAG: --v="2" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400818 4696 flags.go:64] FLAG: --version="false" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400829 4696 flags.go:64] FLAG: --vmodule="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400839 4696 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.400849 4696 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401060 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401072 4696 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401082 4696 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401090 4696 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401098 4696 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401106 4696 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401114 4696 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401121 4696 
feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401130 4696 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401138 4696 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401146 4696 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401154 4696 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401161 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401169 4696 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401176 4696 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401184 4696 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401192 4696 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401201 4696 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401208 4696 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401216 4696 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401223 4696 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401231 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 
15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401239 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401246 4696 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401254 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401265 4696 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401273 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401281 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401289 4696 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401297 4696 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401304 4696 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401312 4696 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401322 4696 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401332 4696 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401342 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401351 4696 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401359 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401368 4696 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401379 4696 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401388 4696 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401396 4696 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401404 4696 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401413 4696 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401421 4696 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401429 4696 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401437 4696 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401445 4696 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401454 4696 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401461 4696 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401469 4696 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401477 4696 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401484 4696 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401495 4696 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401504 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401512 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401551 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401562 4696 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401580 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401591 4696 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401604 4696 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401614 4696 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401624 4696 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401634 4696 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401644 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401652 4696 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401659 4696 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401667 4696 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401675 4696 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401683 4696 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401691 4696 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.401698 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 
15:35:57.401725 4696 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.409724 4696 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.409764 4696 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409827 4696 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409836 4696 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409846 4696 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409851 4696 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409856 4696 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409861 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409866 4696 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409872 4696 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409878 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409884 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409889 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409894 4696 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409899 4696 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409904 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409909 4696 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409913 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409917 4696 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409921 4696 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409925 4696 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409929 4696 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409932 4696 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409936 4696 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409940 4696 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409944 4696 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409947 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409951 4696 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409954 4696 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:35:57 crc 
kubenswrapper[4696]: W0318 15:35:57.409958 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409962 4696 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409966 4696 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409969 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409973 4696 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409976 4696 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409979 4696 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409985 4696 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409990 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409994 4696 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.409997 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410001 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410006 4696 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410009 4696 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410013 4696 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410017 4696 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410020 4696 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410024 4696 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410027 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410031 4696 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410034 4696 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410038 4696 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410041 4696 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410044 4696 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410048 4696 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410051 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410055 4696 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410058 4696 feature_gate.go:330] 
unrecognized feature gate: EtcdBackendQuota Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410062 4696 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410065 4696 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410069 4696 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410073 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410076 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410080 4696 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410083 4696 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410087 4696 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410090 4696 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410093 4696 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410098 4696 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410101 4696 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410104 4696 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410108 4696 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:35:57 crc 
kubenswrapper[4696]: W0318 15:35:57.410111 4696 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410115 4696 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.410122 4696 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410247 4696 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410252 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410258 4696 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410262 4696 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410266 4696 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410270 4696 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410273 4696 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410278 4696 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410283 4696 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410288 4696 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410292 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410296 4696 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410300 4696 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410304 4696 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410307 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410311 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410315 4696 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410319 4696 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410322 4696 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410326 4696 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410329 4696 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410333 4696 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410336 4696 feature_gate.go:330] 
unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410340 4696 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410344 4696 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410347 4696 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410350 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410354 4696 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410357 4696 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410361 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410364 4696 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410367 4696 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410370 4696 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410374 4696 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410378 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410382 4696 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410385 4696 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:35:57 crc 
kubenswrapper[4696]: W0318 15:35:57.410389 4696 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410392 4696 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410396 4696 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410399 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410402 4696 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410406 4696 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410409 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410413 4696 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410417 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410421 4696 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410424 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410428 4696 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410431 4696 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410435 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:35:57 crc kubenswrapper[4696]: 
W0318 15:35:57.410438 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410442 4696 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410445 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410449 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410452 4696 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410456 4696 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410459 4696 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410463 4696 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410468 4696 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410472 4696 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410477 4696 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410482 4696 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410486 4696 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410491 4696 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410495 4696 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410499 4696 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410502 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410506 4696 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410509 4696 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.410513 4696 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.410533 4696 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.410824 4696 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.414392 4696 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.417463 4696 bootstrap.go:101] "Use the 
bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.417592 4696 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.418998 4696 server.go:997] "Starting client certificate rotation" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.419023 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.419144 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.444452 4696 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.446896 4696 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.448298 4696 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.464008 4696 log.go:25] "Validated CRI v1 runtime API" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.499644 4696 log.go:25] "Validated CRI v1 image API" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.501582 4696 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.506382 
4696 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-15-31-30-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.506435 4696 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.527844 4696 manager.go:217] Machine: {Timestamp:2026-03-18 15:35:57.52534203 +0000 UTC m=+0.531516276 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a02dd351-206a-4946-acba-446bc8ebd92d BootID:33442fad-71cc-47a2-b717-94dce6899c46 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 
Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:de:62:b8 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:de:62:b8 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:bc:3c:c6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a4:ee:0d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:99:d1:13 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7c:d1:fc Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a2:5b:4d:f6:b3:54 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:82:bf:a6:4f:01:8b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.528138 4696 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available. Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.528362 4696 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.528903 4696 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.529159 4696 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.529217 4696 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Q
uantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.530416 4696 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.530443 4696 container_manager_linux.go:303] "Creating device plugin manager" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.531019 4696 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.531054 4696 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.531258 4696 state_mem.go:36] "Initialized new in-memory state store" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.531366 4696 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.534823 4696 kubelet.go:418] "Attempting to sync node with API server" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.534852 4696 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.534880 4696 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.534897 4696 kubelet.go:324] "Adding apiserver pod source" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.534909 4696 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.538300 4696 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.539208 4696 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.542613 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.542719 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.542557 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.542755 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.543103 4696 kubelet.go:854] "Not starting ClusterTrustBundle 
informer because we are in static kubelet mode" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.544644 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.544753 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.544817 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.544873 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.544929 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.544978 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.545032 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.545085 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.545141 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.545194 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.545284 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.545376 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.546905 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.547560 4696 server.go:1280] "Started 
kubelet" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.548947 4696 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 15:35:57 crc systemd[1]: Started Kubernetes Kubelet. Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.549067 4696 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.549244 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.549553 4696 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.550136 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.550195 4696 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.550285 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.550316 4696 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.550293 4696 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.550351 4696 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.550981 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.192:6443: connect: connection refused" interval="200ms" Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.550998 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.551053 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.551191 4696 factory.go:55] Registering systemd factory Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.551213 4696 factory.go:221] Registration of the systemd container factory successfully Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.553257 4696 server.go:460] "Adding debug handlers to kubelet server" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.553703 4696 factory.go:153] Registering CRI-O factory Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.553779 4696 factory.go:221] Registration of the crio container factory successfully Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.553985 4696 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.554020 4696 factory.go:103] Registering Raw factory Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.554038 4696 manager.go:1196] Started watching for new ooms in manager Mar 18 15:35:57 crc kubenswrapper[4696]: 
I0318 15:35:57.554468 4696 manager.go:319] Starting recovery of all containers Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.555879 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189df9804419a8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.547505831 +0000 UTC m=+0.553680057,LastTimestamp:2026-03-18 15:35:57.547505831 +0000 UTC m=+0.553680057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563832 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563900 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563926 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563937 4696 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563947 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563956 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563966 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563976 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563987 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.563997 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.564006 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.564014 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.564025 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.564037 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.564046 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.564055 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.565655 4696 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.565763 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.565831 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.565895 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.565952 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566036 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566113 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566171 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566230 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566313 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566373 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566433 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566511 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566607 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566676 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566737 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566795 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566877 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566939 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.566997 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567073 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567131 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567192 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567248 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567307 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567390 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567447 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567506 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567640 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567735 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" 
seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567794 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567857 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567918 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.567982 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568040 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568096 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568157 
4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568226 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568290 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568353 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568423 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568494 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568576 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568654 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568748 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568815 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568901 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.568957 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569019 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569095 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569160 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569219 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569272 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569328 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569393 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569449 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569509 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569586 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569652 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569721 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569801 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569883 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.569959 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570028 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570091 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570154 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570211 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570266 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570327 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570387 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570448 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570506 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570605 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" 
seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570688 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570765 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570848 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.570945 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571030 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571106 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 15:35:57 crc 
kubenswrapper[4696]: I0318 15:35:57.571191 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571275 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571344 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571405 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571464 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571541 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571608 4696 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571686 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571748 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571810 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571884 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.571946 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.572796 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573084 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573147 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573211 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573321 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573404 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573492 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573630 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573727 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573508 4696 manager.go:324] Recovery completed Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573790 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573917 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.573982 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574050 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574117 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574182 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574241 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574320 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574377 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574442 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574498 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574578 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574637 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574696 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574753 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574813 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" 
seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574868 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574932 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.574986 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575044 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575101 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575160 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575226 
4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575285 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575346 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575406 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575463 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575536 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575602 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575659 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575753 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575822 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575885 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.575949 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576003 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576062 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576123 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576189 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576248 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576306 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576368 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" 
seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576429 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576482 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576558 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576623 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576686 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576759 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 15:35:57 crc 
kubenswrapper[4696]: I0318 15:35:57.576832 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576907 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.576971 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577027 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577084 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577147 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577209 4696 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577270 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577330 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577385 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577451 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577509 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577610 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577683 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577741 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577796 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577853 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577921 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.577986 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578039 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578097 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578152 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578209 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578268 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578326 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578398 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578460 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578554 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578630 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578705 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578791 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578852 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578907 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.578959 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579016 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579071 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579126 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579178 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579232 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579287 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579341 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579399 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579460 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579512 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579591 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579650 4696 reconstruct.go:97] "Volume reconstruction finished" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.579719 4696 reconciler.go:26] "Reconciler: start to sync state" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.584390 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.585648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.585687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.585700 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.586501 4696 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.586542 4696 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.586566 4696 
state_mem.go:36] "Initialized new in-memory state store" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.588669 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189df9804419a8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.547505831 +0000 UTC m=+0.553680057,LastTimestamp:2026-03-18 15:35:57.547505831 +0000 UTC m=+0.553680057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.594587 4696 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.596115 4696 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.596188 4696 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.596210 4696 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.596650 4696 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 15:35:57 crc kubenswrapper[4696]: W0318 15:35:57.596728 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.596771 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.605154 4696 policy_none.go:49] "None policy: Start" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.605976 4696 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.606096 4696 state_mem.go:35] "Initializing new in-memory state store" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.650941 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.660558 4696 manager.go:334] "Starting Device Plugin manager" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.660631 4696 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.660643 4696 server.go:79] "Starting device plugin registration server" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.661598 4696 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.661642 4696 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.661946 4696 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.662021 4696 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.662027 4696 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.667367 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.697080 4696 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.697173 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.698936 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.698967 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.698976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.699067 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.699393 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.699574 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.699834 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.699887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.699897 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.699961 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.700066 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.700101 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.700913 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.700938 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.700949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701163 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701187 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701198 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701279 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701343 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701366 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc 
kubenswrapper[4696]: I0318 15:35:57.701586 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701711 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701918 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701947 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.701958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.702129 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.702266 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.702304 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.703715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.703743 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.703756 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.703914 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.704021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.704123 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.704914 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.704974 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.704986 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.705236 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.705278 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.706726 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.706758 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.706768 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.751482 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.762540 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.763559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.763589 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.763603 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.763630 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.764102 4696 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782435 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782485 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782513 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782671 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782738 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782762 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782786 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782806 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782827 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782875 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782914 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782936 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.782988 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.783012 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.783033 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:35:57 
crc kubenswrapper[4696]: I0318 15:35:57.883749 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883787 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883805 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883822 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883839 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883864 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883877 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883891 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883909 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883923 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883935 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883940 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883989 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884011 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884018 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.883948 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884030 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884069 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884082 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884092 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884102 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884113 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884116 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884135 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884157 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884180 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884200 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884216 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884231 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.884246 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.965014 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.966001 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.966038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.966049 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:35:57 crc kubenswrapper[4696]: I0318 15:35:57.966073 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:35:57 crc kubenswrapper[4696]: E0318 15:35:57.966568 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.038432 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.066108 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.081013 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.081553 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c7ba9690e8d99e1c2878eb3b0702725d624675449fcdb782db8c09d1482b1ecf WatchSource:0}: Error finding container c7ba9690e8d99e1c2878eb3b0702725d624675449fcdb782db8c09d1482b1ecf: Status 404 returned error can't find the container with id c7ba9690e8d99e1c2878eb3b0702725d624675449fcdb782db8c09d1482b1ecf
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.097488 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a5cac60ab3f707366e47099063f8e5d8f475f75f914f3c08bb2ac4ee24881c7f WatchSource:0}: Error finding container a5cac60ab3f707366e47099063f8e5d8f475f75f914f3c08bb2ac4ee24881c7f: Status 404 returned error can't find the container with id a5cac60ab3f707366e47099063f8e5d8f475f75f914f3c08bb2ac4ee24881c7f
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.097901 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c3643db04b418b276a6841b09f4479f14574aea8519a1f2eb1555d3a8476bffc WatchSource:0}: Error finding container c3643db04b418b276a6841b09f4479f14574aea8519a1f2eb1555d3a8476bffc: Status 404 returned error can't find the container with id c3643db04b418b276a6841b09f4479f14574aea8519a1f2eb1555d3a8476bffc
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.102630 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.106750 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.123360 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-136a5483d4b0bb95579a831c025ae1b2ef185a45b6a43d03e5ad3b28b6eadcc9 WatchSource:0}: Error finding container 136a5483d4b0bb95579a831c025ae1b2ef185a45b6a43d03e5ad3b28b6eadcc9: Status 404 returned error can't find the container with id 136a5483d4b0bb95579a831c025ae1b2ef185a45b6a43d03e5ad3b28b6eadcc9
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.134024 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ea0407ca0237f140cd53d2b8bd51c224447134817adc88c7f64d917f811b42be WatchSource:0}: Error finding container ea0407ca0237f140cd53d2b8bd51c224447134817adc88c7f64d917f811b42be: Status 404 returned error can't find the container with id ea0407ca0237f140cd53d2b8bd51c224447134817adc88c7f64d917f811b42be
Mar 18 15:35:58 crc kubenswrapper[4696]: E0318 15:35:58.152984 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.367399 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.369128 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.369181 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.369192 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.369230 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:35:58 crc kubenswrapper[4696]: E0318 15:35:58.369952 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.550884 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.601340 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"136a5483d4b0bb95579a831c025ae1b2ef185a45b6a43d03e5ad3b28b6eadcc9"}
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.602635 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:35:58 crc kubenswrapper[4696]: E0318 15:35:58.602762 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.602865 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c3643db04b418b276a6841b09f4479f14574aea8519a1f2eb1555d3a8476bffc"}
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.604206 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a5cac60ab3f707366e47099063f8e5d8f475f75f914f3c08bb2ac4ee24881c7f"}
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.605262 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c7ba9690e8d99e1c2878eb3b0702725d624675449fcdb782db8c09d1482b1ecf"}
Mar 18 15:35:58 crc kubenswrapper[4696]: I0318 15:35:58.606444 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ea0407ca0237f140cd53d2b8bd51c224447134817adc88c7f64d917f811b42be"}
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.833773 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:35:58 crc kubenswrapper[4696]: E0318 15:35:58.834459 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:35:58 crc kubenswrapper[4696]: E0318 15:35:58.954589 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s"
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.968002 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:35:58 crc kubenswrapper[4696]: E0318 15:35:58.968078 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:35:58 crc kubenswrapper[4696]: W0318 15:35:58.994058 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:35:58 crc kubenswrapper[4696]: E0318 15:35:58.994184 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.170569 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.172761 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.172817 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.172832 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.172862 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:35:59 crc kubenswrapper[4696]: E0318 15:35:59.173458 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.494833 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 18 15:35:59 crc kubenswrapper[4696]: E0318 15:35:59.496066 4696 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.550721 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.610425 4696 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a" exitCode=0
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.610565 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.610567 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a"}
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.611366 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.611403 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.611413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.613268 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4eed26715af41fdb53a2c1eac925016c55579075ae588b5c591531d19b21f622"}
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.613306 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"75c6be4af51a48cef4e53dec757f47fb7ed489c5c183cd4408791f9c1dfa2dc1"}
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.613318 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14"}
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.613328 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335"}
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.613405 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.614372 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.614415 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.614428 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.633359 4696 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7a4f2ab0c9da67f71a0761016cf09d58cdc97abb4f88197165e533cd4346c199" exitCode=0
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.633600 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.633594 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7a4f2ab0c9da67f71a0761016cf09d58cdc97abb4f88197165e533cd4346c199"}
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.635068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.635102 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.635113 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.637763 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2" exitCode=0
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.637846 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2"}
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.637974 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.639644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.639684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.639699 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.643555 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.645663 4696 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="883ca3ee77443187b0bed76a9fb774a3afa002fa4cf735ad4e5c069b16dc7de1" exitCode=0
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.645908 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"883ca3ee77443187b0bed76a9fb774a3afa002fa4cf735ad4e5c069b16dc7de1"}
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.645937 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.646910 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.646943 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.646957 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.654328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.654402 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:35:59 crc kubenswrapper[4696]: I0318 15:35:59.654425 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:00 crc kubenswrapper[4696]: W0318 15:36:00.468361 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:36:00 crc kubenswrapper[4696]: E0318 15:36:00.468491 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.550182 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:36:00 crc kubenswrapper[4696]: E0318 15:36:00.556074 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.650358 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"30c1df36f7f5aa9da678a3b7317fd6caeabd0b8abacf0f95d6af40b599e978e3"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.650393 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"75378392c29b28f1271ad32f90c030a02e20efaeb9f4e33b462c445d31427213"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.650404 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b6ad97f0893d14187ba6a594d37301ce91193e41d97cba600a26acde8bf16746"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.650427 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.651376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.651397 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.651405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.654835 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.654882 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.654916 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.654930 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.656905 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"332711dc63c9bf3d6c8a9cbba45bb6085d89f15b1c238e5a3a57e381f6804159"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.656950 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.657780 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.657813 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.657821 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.659734 4696 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9" exitCode=0
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.659823 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.659843 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9"}
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.659909 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.660980 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.661017 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.661030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.661978 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.662027 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.662040 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:00 crc kubenswrapper[4696]: W0318 15:36:00.760817 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 18 15:36:00 crc kubenswrapper[4696]: E0318 15:36:00.760921 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.774225 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.775397 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.775432 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.775441 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4696]: I0318 15:36:00.775486 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:36:00 crc kubenswrapper[4696]: E0318 15:36:00.776087 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.664182 4696 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633" exitCode=0 Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.664252 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633"} Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.664329 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:01 crc 
kubenswrapper[4696]: I0318 15:36:01.668349 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.668398 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.668412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.671869 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7444d4d18264c9b4599c5ae5ad98366c7ac37ce1f5fb23c75d112bc284fa5825"} Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.671907 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.671941 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.672041 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.671942 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.672714 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.672757 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.672775 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc 
kubenswrapper[4696]: I0318 15:36:01.673121 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.673132 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.673150 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.673150 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.673159 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4696]: I0318 15:36:01.673167 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.090998 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.091253 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.093953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.094030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.094068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.102129 4696 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.518496 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.682684 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5"} Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.682788 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff"} Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.682807 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.682823 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc"} Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.682852 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b"} Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.682878 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.682938 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 
15:36:02.684369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.684449 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.684477 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.684861 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.684907 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4696]: I0318 15:36:02.684922 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.337673 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.552068 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.688429 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52"} Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.688469 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.688510 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:03 crc kubenswrapper[4696]: 
I0318 15:36:03.688547 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.688590 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.689114 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.689490 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.689543 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.689581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.689717 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.689734 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.689742 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.692889 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.693014 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.693063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc 
kubenswrapper[4696]: I0318 15:36:03.785044 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.976799 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.978068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.978114 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.978125 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4696]: I0318 15:36:03.978156 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.035672 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.418131 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.690895 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.691007 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.690900 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.691067 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.691820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.691861 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.691874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.692256 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.692290 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.692300 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.692635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.692676 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4696]: I0318 15:36:04.692732 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4696]: I0318 15:36:05.693248 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:05 crc kubenswrapper[4696]: I0318 15:36:05.694130 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4696]: I0318 15:36:05.694155 4696 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4696]: I0318 15:36:05.694163 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4696]: I0318 15:36:06.338558 4696 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:36:06 crc kubenswrapper[4696]: I0318 15:36:06.338632 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 15:36:06 crc kubenswrapper[4696]: I0318 15:36:06.526061 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:36:06 crc kubenswrapper[4696]: I0318 15:36:06.526242 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:06 crc kubenswrapper[4696]: I0318 15:36:06.527189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4696]: I0318 15:36:06.527218 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4696]: I0318 15:36:06.527229 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.361362 4696 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.361560 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.363016 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.363054 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.363063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.543157 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.543439 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.544638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.544678 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4696]: I0318 15:36:07.544687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4696]: E0318 15:36:07.667492 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:36:10 crc kubenswrapper[4696]: I0318 15:36:10.686664 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-etcd/etcd-crc" Mar 18 15:36:10 crc kubenswrapper[4696]: I0318 15:36:10.686820 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:10 crc kubenswrapper[4696]: I0318 15:36:10.687683 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4696]: I0318 15:36:10.687707 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4696]: I0318 15:36:10.687715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4696]: W0318 15:36:11.302088 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z Mar 18 15:36:11 crc kubenswrapper[4696]: E0318 15:36:11.302250 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.305313 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z Mar 18 15:36:11 crc kubenswrapper[4696]: E0318 15:36:11.311471 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:36:11 crc kubenswrapper[4696]: W0318 15:36:11.314291 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z Mar 18 15:36:11 crc kubenswrapper[4696]: E0318 15:36:11.314376 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:36:11 crc kubenswrapper[4696]: W0318 15:36:11.314975 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z Mar 18 15:36:11 crc kubenswrapper[4696]: E0318 15:36:11.315030 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:36:11 crc kubenswrapper[4696]: E0318 15:36:11.315203 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df9804419a8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.547505831 +0000 UTC m=+0.553680057,LastTimestamp:2026-03-18 15:35:57.547505831 +0000 UTC m=+0.553680057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:11 crc kubenswrapper[4696]: E0318 15:36:11.316868 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.317729 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.317775 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 15:36:11 crc kubenswrapper[4696]: E0318 15:36:11.318558 4696 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:36:11 crc kubenswrapper[4696]: W0318 15:36:11.320655 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z Mar 18 15:36:11 crc kubenswrapper[4696]: E0318 15:36:11.320736 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.326421 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.326787 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.553028 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2026-02-23T05:33:13Z Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.707104 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.709223 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7444d4d18264c9b4599c5ae5ad98366c7ac37ce1f5fb23c75d112bc284fa5825" 
exitCode=255 Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.709286 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7444d4d18264c9b4599c5ae5ad98366c7ac37ce1f5fb23c75d112bc284fa5825"} Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.709546 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.710914 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.710952 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.710994 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4696]: I0318 15:36:11.711906 4696 scope.go:117] "RemoveContainer" containerID="7444d4d18264c9b4599c5ae5ad98366c7ac37ce1f5fb23c75d112bc284fa5825" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.523696 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]log ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]etcd ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 15:36:12 crc 
kubenswrapper[4696]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-apiextensions-informers ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-apiextensions-controllers ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/crd-informer-synced ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 15:36:12 crc kubenswrapper[4696]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/bootstrap-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 15:36:12 crc kubenswrapper[4696]: 
[+]poststarthook/start-kube-aggregator-informers ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/apiservice-registration-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/apiservice-discovery-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]autoregister-completion ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 15:36:12 crc kubenswrapper[4696]: livez check failed Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.523800 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.714616 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.715142 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.717379 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8" 
exitCode=255 Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.717470 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8"} Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.717626 4696 scope.go:117] "RemoveContainer" containerID="7444d4d18264c9b4599c5ae5ad98366c7ac37ce1f5fb23c75d112bc284fa5825" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.717795 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.718723 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.718762 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.718772 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:12 crc kubenswrapper[4696]: I0318 15:36:12.719449 4696 scope.go:117] "RemoveContainer" containerID="923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8" Mar 18 15:36:12 crc kubenswrapper[4696]: E0318 15:36:12.719695 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:36:13 crc kubenswrapper[4696]: I0318 15:36:13.126743 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:13Z is after 2026-02-23T05:33:13Z Mar 18 15:36:13 crc kubenswrapper[4696]: I0318 15:36:13.553636 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:13Z is after 2026-02-23T05:33:13Z Mar 18 15:36:13 crc kubenswrapper[4696]: I0318 15:36:13.722640 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:36:14 crc kubenswrapper[4696]: I0318 15:36:14.036699 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:14 crc kubenswrapper[4696]: I0318 15:36:14.037051 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:14 crc kubenswrapper[4696]: I0318 15:36:14.038931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:14 crc kubenswrapper[4696]: I0318 15:36:14.039012 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:14 crc kubenswrapper[4696]: I0318 15:36:14.039031 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:14 crc kubenswrapper[4696]: I0318 15:36:14.039792 4696 scope.go:117] "RemoveContainer" containerID="923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8" Mar 18 15:36:14 crc kubenswrapper[4696]: E0318 15:36:14.040012 4696 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:36:14 crc kubenswrapper[4696]: I0318 15:36:14.552829 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:14Z is after 2026-02-23T05:33:13Z Mar 18 15:36:15 crc kubenswrapper[4696]: I0318 15:36:15.553611 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:15Z is after 2026-02-23T05:33:13Z Mar 18 15:36:16 crc kubenswrapper[4696]: I0318 15:36:16.338648 4696 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:36:16 crc kubenswrapper[4696]: I0318 15:36:16.338722 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Mar 18 15:36:16 crc kubenswrapper[4696]: W0318 15:36:16.455142 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:16Z is after 2026-02-23T05:33:13Z Mar 18 15:36:16 crc kubenswrapper[4696]: E0318 15:36:16.455212 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:36:16 crc kubenswrapper[4696]: I0318 15:36:16.553584 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:16Z is after 2026-02-23T05:33:13Z Mar 18 15:36:16 crc kubenswrapper[4696]: W0318 15:36:16.720464 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:16Z is after 2026-02-23T05:33:13Z Mar 18 15:36:16 crc kubenswrapper[4696]: E0318 15:36:16.720545 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.365470 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.365639 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.366630 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.366664 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.366675 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.524563 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.524690 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.525619 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.525663 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.525678 4696 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.526266 4696 scope.go:117] "RemoveContainer" containerID="923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8" Mar 18 15:36:17 crc kubenswrapper[4696]: E0318 15:36:17.526489 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.528410 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.538776 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.552814 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2026-02-23T05:33:13Z Mar 18 15:36:17 crc kubenswrapper[4696]: E0318 15:36:17.667826 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.711916 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.713106 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.713138 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.713147 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.713165 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:36:17 crc kubenswrapper[4696]: E0318 15:36:17.717487 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 15:36:17 crc kubenswrapper[4696]: E0318 15:36:17.721195 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.734297 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.735414 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.735457 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.735468 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:17 crc kubenswrapper[4696]: I0318 15:36:17.736165 4696 scope.go:117] "RemoveContainer" containerID="923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8" Mar 18 15:36:17 crc kubenswrapper[4696]: E0318 
15:36:17.736310 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:36:18 crc kubenswrapper[4696]: I0318 15:36:18.554196 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:18 crc kubenswrapper[4696]: I0318 15:36:18.735709 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:18 crc kubenswrapper[4696]: I0318 15:36:18.736565 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:18 crc kubenswrapper[4696]: I0318 15:36:18.736604 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:18 crc kubenswrapper[4696]: I0318 15:36:18.736615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:18 crc kubenswrapper[4696]: I0318 15:36:18.737076 4696 scope.go:117] "RemoveContainer" containerID="923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8" Mar 18 15:36:18 crc kubenswrapper[4696]: E0318 15:36:18.737279 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:36:19 crc kubenswrapper[4696]: I0318 15:36:19.556444 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:20 crc kubenswrapper[4696]: W0318 15:36:20.001242 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:20 crc kubenswrapper[4696]: E0318 15:36:20.001813 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.035638 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.047186 4696 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.554813 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.710703 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.710864 4696 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.711967 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.712022 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.712035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.721924 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.740724 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.741671 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.741716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:20 crc kubenswrapper[4696]: I0318 15:36:20.741728 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:20 crc kubenswrapper[4696]: W0318 15:36:20.986562 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 15:36:20 crc kubenswrapper[4696]: E0318 15:36:20.986624 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: 
runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.321244 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804419a8a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.547505831 +0000 UTC m=+0.553680057,LastTimestamp:2026-03-18 15:35:57.547505831 +0000 UTC m=+0.553680057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.331533 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046602423 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC m=+0.591853601,LastTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC m=+0.591853601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.335859 4696 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804660656a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,LastTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.340597 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046608b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585705836 +0000 UTC m=+0.591880042,LastTimestamp:2026-03-18 15:35:57.585705836 +0000 UTC m=+0.591880042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.344643 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189df9804afba5f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.662979575 +0000 UTC m=+0.669153781,LastTimestamp:2026-03-18 15:35:57.662979575 +0000 UTC m=+0.669153781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.349225 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046602423\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046602423 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC m=+0.591853601,LastTimestamp:2026-03-18 15:35:57.698955713 +0000 UTC m=+0.705129919,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.353304 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9804660656a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804660656a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,LastTimestamp:2026-03-18 15:35:57.698973004 +0000 UTC m=+0.705147210,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.356707 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046608b6c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046608b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585705836 +0000 UTC m=+0.591880042,LastTimestamp:2026-03-18 15:35:57.698980814 +0000 UTC m=+0.705155020,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.360035 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046602423\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046602423 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC m=+0.591853601,LastTimestamp:2026-03-18 15:35:57.699874243 +0000 UTC m=+0.706048449,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.364812 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9804660656a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804660656a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,LastTimestamp:2026-03-18 15:35:57.699894314 +0000 UTC m=+0.706068520,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.368679 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046608b6c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046608b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585705836 +0000 UTC 
m=+0.591880042,LastTimestamp:2026-03-18 15:35:57.699902204 +0000 UTC m=+0.706076400,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.372825 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046602423\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046602423 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC m=+0.591853601,LastTimestamp:2026-03-18 15:35:57.700931959 +0000 UTC m=+0.707106165,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.376471 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9804660656a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804660656a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,LastTimestamp:2026-03-18 15:35:57.70094561 +0000 UTC m=+0.707119816,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.379938 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046608b6c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046608b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585705836 +0000 UTC m=+0.591880042,LastTimestamp:2026-03-18 15:35:57.70095557 +0000 UTC m=+0.707129776,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.384292 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046602423\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046602423 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC m=+0.591853601,LastTimestamp:2026-03-18 15:35:57.70118174 +0000 UTC m=+0.707355946,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.388387 4696 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9804660656a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804660656a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,LastTimestamp:2026-03-18 15:35:57.70119391 +0000 UTC m=+0.707368116,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.392059 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046608b6c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046608b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585705836 +0000 UTC m=+0.591880042,LastTimestamp:2026-03-18 15:35:57.701203881 +0000 UTC m=+0.707378087,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.395609 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046602423\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046602423 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC m=+0.591853601,LastTimestamp:2026-03-18 15:35:57.701361048 +0000 UTC m=+0.707535254,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.400642 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9804660656a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804660656a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,LastTimestamp:2026-03-18 15:35:57.701372608 +0000 UTC m=+0.707546814,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.403994 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046608b6c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046608b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585705836 +0000 UTC m=+0.591880042,LastTimestamp:2026-03-18 15:35:57.701382019 +0000 UTC m=+0.707556225,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.407366 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046602423\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046602423 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC m=+0.591853601,LastTimestamp:2026-03-18 15:35:57.701939213 +0000 UTC m=+0.708113419,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.411471 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9804660656a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804660656a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,LastTimestamp:2026-03-18 15:35:57.701953853 +0000 UTC m=+0.708128059,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.414889 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046608b6c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046608b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585705836 +0000 UTC m=+0.591880042,LastTimestamp:2026-03-18 15:35:57.701963664 +0000 UTC m=+0.708137870,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.418342 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df98046602423\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df98046602423 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585679395 +0000 UTC 
m=+0.591853601,LastTimestamp:2026-03-18 15:35:57.703734701 +0000 UTC m=+0.709908907,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.422304 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9804660656a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9804660656a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:57.585696106 +0000 UTC m=+0.591870312,LastTimestamp:2026-03-18 15:35:57.703750421 +0000 UTC m=+0.709924627,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.427490 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df98064984bed openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.092676077 +0000 UTC m=+1.098850283,LastTimestamp:2026-03-18 15:35:58.092676077 +0000 UTC m=+1.098850283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.432072 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df98064fd4522 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.099293474 +0000 UTC m=+1.105467680,LastTimestamp:2026-03-18 15:35:58.099293474 +0000 UTC m=+1.105467680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.436600 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df980650c93e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.100296677 +0000 UTC m=+1.106470883,LastTimestamp:2026-03-18 15:35:58.100296677 +0000 UTC m=+1.106470883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.441036 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df98066b9e233 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.128431667 +0000 UTC m=+1.134605873,LastTimestamp:2026-03-18 15:35:58.128431667 +0000 UTC m=+1.134605873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.444977 4696 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980673fb9df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.137203167 +0000 UTC m=+1.143377373,LastTimestamp:2026-03-18 15:35:58.137203167 +0000 UTC m=+1.143377373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.449431 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df980864e10ec openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.658236652 +0000 UTC m=+1.664410878,LastTimestamp:2026-03-18 15:35:58.658236652 +0000 UTC m=+1.664410878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc 
kubenswrapper[4696]: E0318 15:36:21.453020 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980864f5d8c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.658321804 +0000 UTC m=+1.664496010,LastTimestamp:2026-03-18 15:35:58.658321804 +0000 UTC m=+1.664496010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.457045 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980865048b6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.658382006 +0000 UTC m=+1.664556232,LastTimestamp:2026-03-18 15:35:58.658382006 +0000 UTC m=+1.664556232,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.460532 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df98087451b94 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.674426772 +0000 UTC m=+1.680600978,LastTimestamp:2026-03-18 15:35:58.674426772 +0000 UTC m=+1.680600978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.465874 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df98087570cb8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.675602616 +0000 UTC 
m=+1.681794142,LastTimestamp:2026-03-18 15:35:58.675602616 +0000 UTC m=+1.681794142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.469792 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df98087605956 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.676212054 +0000 UTC m=+1.682386260,LastTimestamp:2026-03-18 15:35:58.676212054 +0000 UTC m=+1.682386260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.473486 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9808764175f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.676457311 +0000 UTC 
m=+1.682631507,LastTimestamp:2026-03-18 15:35:58.676457311 +0000 UTC m=+1.682631507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.477321 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df98087686b69 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.676740969 +0000 UTC m=+1.682915175,LastTimestamp:2026-03-18 15:35:58.676740969 +0000 UTC m=+1.682915175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.481072 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980876e5354 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.67712802 +0000 UTC m=+1.683302226,LastTimestamp:2026-03-18 15:35:58.67712802 +0000 UTC m=+1.683302226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.484807 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df98088942e15 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.696386069 +0000 UTC m=+1.702560295,LastTimestamp:2026-03-18 15:35:58.696386069 +0000 UTC m=+1.702560295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.488497 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980993cb8dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.975867101 +0000 UTC m=+1.982041297,LastTimestamp:2026-03-18 15:35:58.975867101 +0000 UTC m=+1.982041297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.492074 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df98099b6e277 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.983873143 +0000 UTC m=+1.990047389,LastTimestamp:2026-03-18 15:35:58.983873143 +0000 UTC m=+1.990047389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.495894 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df98099c9a1ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.985101739 +0000 UTC m=+1.991275945,LastTimestamp:2026-03-18 15:35:58.985101739 +0000 UTC m=+1.991275945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.499501 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980a6b9e6a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.202174629 +0000 UTC m=+2.208348865,LastTimestamp:2026-03-18 15:35:59.202174629 +0000 UTC m=+2.208348865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.503563 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df980a6c52106 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.20291047 +0000 UTC m=+2.209084716,LastTimestamp:2026-03-18 15:35:59.20291047 +0000 UTC m=+2.209084716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.506342 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980a79d7ce2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.217089762 +0000 
UTC m=+2.223263998,LastTimestamp:2026-03-18 15:35:59.217089762 +0000 UTC m=+2.223263998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.509881 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980a7b2a3a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.218475942 +0000 UTC m=+2.224650178,LastTimestamp:2026-03-18 15:35:59.218475942 +0000 UTC m=+2.224650178,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.513729 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980b66b4e32 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.46545925 +0000 UTC m=+2.471633486,LastTimestamp:2026-03-18 15:35:59.46545925 +0000 UTC m=+2.471633486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.517372 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980b750d819 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.480502297 +0000 UTC m=+2.486676503,LastTimestamp:2026-03-18 15:35:59.480502297 +0000 UTC m=+2.486676503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.520999 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df980bf31feac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.612698284 +0000 UTC m=+2.618872490,LastTimestamp:2026-03-18 15:35:59.612698284 +0000 UTC m=+2.618872490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.525792 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980c09fcdff openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.636671999 +0000 UTC m=+2.642846205,LastTimestamp:2026-03-18 15:35:59.636671999 +0000 UTC m=+2.642846205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.532961 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980c105c3d9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.643354073 +0000 UTC m=+2.649528279,LastTimestamp:2026-03-18 15:35:59.643354073 +0000 UTC m=+2.649528279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.539089 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df980c1615f5a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.649357658 +0000 UTC m=+2.655531864,LastTimestamp:2026-03-18 15:35:59.649357658 +0000 UTC m=+2.655531864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.543890 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df980cef5d6a2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.87719133 +0000 UTC m=+2.883365556,LastTimestamp:2026-03-18 15:35:59.87719133 +0000 UTC m=+2.883365556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.548793 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df980ceff6df0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.877819888 +0000 UTC m=+2.883994094,LastTimestamp:2026-03-18 15:35:59.877819888 +0000 UTC m=+2.883994094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: I0318 15:36:21.554446 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.554643 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980cf051f67 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.878192999 +0000 UTC m=+2.884367205,LastTimestamp:2026-03-18 15:35:59.878192999 +0000 UTC m=+2.884367205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc 
kubenswrapper[4696]: E0318 15:36:21.557959 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980cf095067 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.878467687 +0000 UTC m=+2.884641893,LastTimestamp:2026-03-18 15:35:59.878467687 +0000 UTC m=+2.884641893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.561947 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df980cfe7ac16 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.89304015 +0000 UTC m=+2.899214356,LastTimestamp:2026-03-18 15:35:59.89304015 +0000 UTC m=+2.899214356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.565833 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980d018e406 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.896265734 +0000 UTC m=+2.902439940,LastTimestamp:2026-03-18 15:35:59.896265734 +0000 UTC m=+2.902439940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.569882 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980d020a6ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.896774399 +0000 UTC m=+2.902948605,LastTimestamp:2026-03-18 15:35:59.896774399 +0000 UTC 
m=+2.902948605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.574000 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980d02920e7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.897329895 +0000 UTC m=+2.903504101,LastTimestamp:2026-03-18 15:35:59.897329895 +0000 UTC m=+2.903504101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.578062 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df980d031ee53 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container 
etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.897906771 +0000 UTC m=+2.904080977,LastTimestamp:2026-03-18 15:35:59.897906771 +0000 UTC m=+2.904080977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.581755 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980d06568df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:59.901280479 +0000 UTC m=+2.907454685,LastTimestamp:2026-03-18 15:35:59.901280479 +0000 UTC m=+2.907454685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.587205 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980daceca0f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.075958799 +0000 UTC m=+3.082133005,LastTimestamp:2026-03-18 15:36:00.075958799 +0000 UTC m=+3.082133005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.591767 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980dacf573c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.07599494 +0000 UTC m=+3.082169146,LastTimestamp:2026-03-18 15:36:00.07599494 +0000 UTC m=+3.082169146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.596847 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189df980db741f64 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.086794084 +0000 UTC m=+3.092968280,LastTimestamp:2026-03-18 15:36:00.086794084 +0000 UTC m=+3.092968280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.601230 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980db89a94d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.088205645 +0000 UTC m=+3.094379851,LastTimestamp:2026-03-18 15:36:00.088205645 +0000 UTC m=+3.094379851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 
15:36:21.607343 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980dbb0f3f3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.090780659 +0000 UTC m=+3.096954865,LastTimestamp:2026-03-18 15:36:00.090780659 +0000 UTC m=+3.096954865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.613253 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980dbbf067a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.091702906 +0000 UTC 
m=+3.097877102,LastTimestamp:2026-03-18 15:36:00.091702906 +0000 UTC m=+3.097877102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.619150 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980e68b6092 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.272867474 +0000 UTC m=+3.279041680,LastTimestamp:2026-03-18 15:36:00.272867474 +0000 UTC m=+3.279041680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.625651 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980e6fcb3ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.28029435 +0000 UTC m=+3.286468586,LastTimestamp:2026-03-18 15:36:00.28029435 +0000 UTC m=+3.286468586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.630340 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df980e74b6ac7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.285452999 +0000 UTC m=+3.291627205,LastTimestamp:2026-03-18 15:36:00.285452999 +0000 UTC m=+3.291627205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.636021 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980e7aff74b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.292042571 +0000 UTC m=+3.298216777,LastTimestamp:2026-03-18 15:36:00.292042571 +0000 UTC m=+3.298216777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.640347 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980e7c125db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.293168603 +0000 UTC m=+3.299342809,LastTimestamp:2026-03-18 15:36:00.293168603 +0000 UTC m=+3.299342809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.646584 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980f324944f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.484234319 +0000 UTC m=+3.490408525,LastTimestamp:2026-03-18 15:36:00.484234319 +0000 UTC m=+3.490408525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.651879 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980f3e5d95e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.496900446 +0000 UTC m=+3.503074652,LastTimestamp:2026-03-18 15:36:00.496900446 +0000 UTC m=+3.503074652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.656903 
4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980f3fa721a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.498250266 +0000 UTC m=+3.504424472,LastTimestamp:2026-03-18 15:36:00.498250266 +0000 UTC m=+3.504424472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.661740 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980fca5785e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.643676254 +0000 UTC m=+3.649850460,LastTimestamp:2026-03-18 15:36:00.643676254 +0000 UTC 
m=+3.649850460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.668995 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980fda6cc6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.660540525 +0000 UTC m=+3.666714731,LastTimestamp:2026-03-18 15:36:00.660540525 +0000 UTC m=+3.666714731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.674415 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df980fdc77f50 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.662683472 +0000 UTC m=+3.668857698,LastTimestamp:2026-03-18 15:36:00.662683472 +0000 UTC m=+3.668857698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.681087 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df981092459b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.853318065 +0000 UTC m=+3.859492271,LastTimestamp:2026-03-18 15:36:00.853318065 +0000 UTC m=+3.859492271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.685681 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df98109bcab27 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.863300391 +0000 UTC m=+3.869474597,LastTimestamp:2026-03-18 15:36:00.863300391 +0000 UTC m=+3.869474597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.692567 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df98139d8d7c0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:01.670453184 +0000 UTC m=+4.676627390,LastTimestamp:2026-03-18 15:36:01.670453184 +0000 UTC m=+4.676627390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.702998 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9814648ad54 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:01.879108948 +0000 UTC m=+4.885283154,LastTimestamp:2026-03-18 15:36:01.879108948 +0000 UTC m=+4.885283154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.707204 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df981470737aa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:01.891596202 +0000 UTC m=+4.897770408,LastTimestamp:2026-03-18 15:36:01.891596202 +0000 UTC m=+4.897770408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.711829 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9814717b0d8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:01.8926758 +0000 UTC m=+4.898850046,LastTimestamp:2026-03-18 15:36:01.8926758 +0000 UTC m=+4.898850046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.717906 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df98155d9448c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.140243084 +0000 UTC m=+5.146417290,LastTimestamp:2026-03-18 15:36:02.140243084 +0000 UTC m=+5.146417290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.721408 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df981578681de openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.168373726 +0000 UTC m=+5.174547932,LastTimestamp:2026-03-18 15:36:02.168373726 +0000 UTC m=+5.174547932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.725368 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df981579a21b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.16965983 +0000 UTC m=+5.175834036,LastTimestamp:2026-03-18 15:36:02.16965983 +0000 UTC m=+5.175834036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.729070 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df98161f0eb58 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.343119704 +0000 UTC m=+5.349293950,LastTimestamp:2026-03-18 15:36:02.343119704 +0000 UTC m=+5.349293950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.732463 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df981626d8f9e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.351288222 +0000 UTC m=+5.357462428,LastTimestamp:2026-03-18 15:36:02.351288222 +0000 UTC m=+5.357462428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.736052 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9816279fec0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.352103104 +0000 UTC m=+5.358277310,LastTimestamp:2026-03-18 15:36:02.352103104 +0000 UTC m=+5.358277310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.739436 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9816d3ad082 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.532511874 +0000 UTC m=+5.538686080,LastTimestamp:2026-03-18 15:36:02.532511874 +0000 UTC m=+5.538686080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.743397 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9816e045f31 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.545721137 +0000 UTC m=+5.551895343,LastTimestamp:2026-03-18 15:36:02.545721137 +0000 UTC m=+5.551895343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.748001 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9816e1eaf09 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.547445513 +0000 UTC m=+5.553619719,LastTimestamp:2026-03-18 15:36:02.547445513 +0000 UTC m=+5.553619719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.752015 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189df9817b5c0035 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.769567797 +0000 UTC m=+5.775742023,LastTimestamp:2026-03-18 15:36:02.769567797 +0000 UTC m=+5.775742023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.757061 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9817c3d284d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:02.784323661 +0000 UTC m=+5.790497877,LastTimestamp:2026-03-18 15:36:02.784323661 +0000 UTC m=+5.790497877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.762819 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 15:36:21 crc kubenswrapper[4696]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189df9825017422a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 15:36:21 crc kubenswrapper[4696]: body: Mar 18 15:36:21 crc kubenswrapper[4696]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:06.338609706 +0000 UTC m=+9.344783912,LastTimestamp:2026-03-18 15:36:06.338609706 +0000 UTC m=+9.344783912,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:36:21 crc kubenswrapper[4696]: > Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.766711 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df9825018050d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:06.338659597 +0000 UTC 
m=+9.344833793,LastTimestamp:2026-03-18 15:36:06.338659597 +0000 UTC m=+9.344833793,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.771743 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 15:36:21 crc kubenswrapper[4696]: &Event{ObjectMeta:{kube-apiserver-crc.189df98378df1638 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 15:36:21 crc kubenswrapper[4696]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 18 15:36:21 crc kubenswrapper[4696]: Mar 18 15:36:21 crc kubenswrapper[4696]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:11.317761592 +0000 UTC m=+14.323935788,LastTimestamp:2026-03-18 15:36:11.317761592 +0000 UTC m=+14.323935788,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:36:21 crc kubenswrapper[4696]: > Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.775617 4696 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df98378dfa1f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:11.317797363 +0000 UTC m=+14.323971569,LastTimestamp:2026-03-18 15:36:11.317797363 +0000 UTC m=+14.323971569,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.779373 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df98378df1638\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 15:36:21 crc kubenswrapper[4696]: &Event{ObjectMeta:{kube-apiserver-crc.189df98378df1638 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 15:36:21 crc kubenswrapper[4696]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, 
clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 18 15:36:21 crc kubenswrapper[4696]: Mar 18 15:36:21 crc kubenswrapper[4696]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:11.317761592 +0000 UTC m=+14.323935788,LastTimestamp:2026-03-18 15:36:11.326758262 +0000 UTC m=+14.332932488,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:36:21 crc kubenswrapper[4696]: > Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.782692 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df98378dfa1f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df98378dfa1f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:11.317797363 +0000 UTC m=+14.323971569,LastTimestamp:2026-03-18 15:36:11.326816594 +0000 UTC m=+14.332990800,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.786400 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df980f3fa721a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189df980f3fa721a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.498250266 +0000 UTC m=+3.504424472,LastTimestamp:2026-03-18 15:36:11.713063782 +0000 UTC m=+14.719237988,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.790280 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df980fca5785e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980fca5785e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.643676254 +0000 UTC m=+3.649850460,LastTimestamp:2026-03-18 15:36:11.908769561 +0000 UTC m=+14.914943767,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 
15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.793808 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df980fda6cc6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df980fda6cc6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:00.660540525 +0000 UTC m=+3.666714731,LastTimestamp:2026-03-18 15:36:11.918307765 +0000 UTC m=+14.924481971,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.797482 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df9825017422a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 15:36:21 crc kubenswrapper[4696]: &Event{ObjectMeta:{kube-controller-manager-crc.189df9825017422a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers) Mar 18 15:36:21 crc kubenswrapper[4696]: body: Mar 18 15:36:21 crc kubenswrapper[4696]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:06.338609706 +0000 UTC m=+9.344783912,LastTimestamp:2026-03-18 15:36:16.338698615 +0000 UTC m=+19.344872811,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:36:21 crc kubenswrapper[4696]: > Mar 18 15:36:21 crc kubenswrapper[4696]: E0318 15:36:21.800941 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df9825018050d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df9825018050d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:06.338659597 +0000 UTC m=+9.344833793,LastTimestamp:2026-03-18 15:36:16.338746717 +0000 UTC m=+19.344920923,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:22 crc kubenswrapper[4696]: I0318 15:36:22.554023 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at 
the cluster scope Mar 18 15:36:23 crc kubenswrapper[4696]: I0318 15:36:23.554855 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:24 crc kubenswrapper[4696]: W0318 15:36:24.431949 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 15:36:24 crc kubenswrapper[4696]: E0318 15:36:24.431996 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 15:36:24 crc kubenswrapper[4696]: I0318 15:36:24.554747 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:24 crc kubenswrapper[4696]: I0318 15:36:24.718158 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:24 crc kubenswrapper[4696]: I0318 15:36:24.720377 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:24 crc kubenswrapper[4696]: I0318 15:36:24.720409 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:24 crc kubenswrapper[4696]: I0318 15:36:24.720422 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:24 crc kubenswrapper[4696]: I0318 15:36:24.720447 
4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:36:24 crc kubenswrapper[4696]: E0318 15:36:24.724212 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 15:36:24 crc kubenswrapper[4696]: E0318 15:36:24.724325 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 15:36:25 crc kubenswrapper[4696]: I0318 15:36:25.555259 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.338770 4696 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.338871 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.339016 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.339237 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.341003 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.341078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.341099 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.341995 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.342283 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14" gracePeriod=30 Mar 18 15:36:26 crc kubenswrapper[4696]: E0318 15:36:26.347499 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 15:36:26 crc kubenswrapper[4696]: &Event{ObjectMeta:{kube-controller-manager-crc.189df986f832afb3 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 15:36:26 crc kubenswrapper[4696]: body: Mar 18 15:36:26 crc kubenswrapper[4696]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:26.338848691 +0000 UTC m=+29.345022917,LastTimestamp:2026-03-18 15:36:26.338848691 +0000 UTC m=+29.345022917,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:36:26 crc kubenswrapper[4696]: > Mar 18 15:36:26 crc kubenswrapper[4696]: E0318 15:36:26.353036 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df986f8338abb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:26.338904763 +0000 UTC 
m=+29.345078989,LastTimestamp:2026-03-18 15:36:26.338904763 +0000 UTC m=+29.345078989,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:26 crc kubenswrapper[4696]: E0318 15:36:26.359616 4696 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df986f866aa31 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:26.342255153 +0000 UTC m=+29.348429429,LastTimestamp:2026-03-18 15:36:26.342255153 +0000 UTC m=+29.348429429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:26 crc kubenswrapper[4696]: E0318 15:36:26.465455 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df980876e5354\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980876e5354 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.67712802 +0000 UTC m=+1.683302226,LastTimestamp:2026-03-18 15:36:26.458653543 +0000 UTC m=+29.464827759,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.554991 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:26 crc kubenswrapper[4696]: E0318 15:36:26.668301 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df980993cb8dd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df980993cb8dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.975867101 +0000 UTC 
m=+1.982041297,LastTimestamp:2026-03-18 15:36:26.663015147 +0000 UTC m=+29.669189353,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:26 crc kubenswrapper[4696]: E0318 15:36:26.681340 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df98099b6e277\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df98099b6e277 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:35:58.983873143 +0000 UTC m=+1.990047389,LastTimestamp:2026-03-18 15:36:26.676697235 +0000 UTC m=+29.682871441,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.761448 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.762023 4696 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14" exitCode=255 Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.762090 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14"} Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.762159 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d50968d8853990e5727bc7800f1dc23ceaf69136e5dd91eb6baeb0493ce8723"} Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.762313 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.763401 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.763477 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:26 crc kubenswrapper[4696]: I0318 15:36:26.763503 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:27 crc kubenswrapper[4696]: I0318 15:36:27.556413 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:27 crc kubenswrapper[4696]: E0318 15:36:27.668253 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:36:28 crc kubenswrapper[4696]: W0318 15:36:28.354207 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 
18 15:36:28 crc kubenswrapper[4696]: E0318 15:36:28.354283 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 15:36:28 crc kubenswrapper[4696]: I0318 15:36:28.555746 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:29 crc kubenswrapper[4696]: I0318 15:36:29.557260 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:30 crc kubenswrapper[4696]: I0318 15:36:30.557605 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:31 crc kubenswrapper[4696]: I0318 15:36:31.557632 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:31 crc kubenswrapper[4696]: I0318 15:36:31.724316 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:31 crc kubenswrapper[4696]: I0318 15:36:31.726261 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:31 crc kubenswrapper[4696]: I0318 15:36:31.726321 4696 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:31 crc kubenswrapper[4696]: I0318 15:36:31.726333 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:31 crc kubenswrapper[4696]: I0318 15:36:31.726366 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:36:31 crc kubenswrapper[4696]: E0318 15:36:31.735492 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 15:36:31 crc kubenswrapper[4696]: E0318 15:36:31.735963 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 15:36:32 crc kubenswrapper[4696]: I0318 15:36:32.554385 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.338045 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.338285 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.340249 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.340301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.340319 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.556746 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.596726 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.598571 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.598624 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.598633 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.599247 4696 scope.go:117] "RemoveContainer" containerID="923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.785104 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.785253 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.786430 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.786504 4696 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:33 crc kubenswrapper[4696]: I0318 15:36:33.786544 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.556477 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.787436 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.788370 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.790409 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="06ac02324b6506e3dc6a89431994943093a2f855e751d6b881835e9c54d885c7" exitCode=255 Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.790466 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"06ac02324b6506e3dc6a89431994943093a2f855e751d6b881835e9c54d885c7"} Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.790608 4696 scope.go:117] "RemoveContainer" containerID="923b709746ef2524ea57be800f1bae6198816247c09bfb01b6bfaac258f0f5d8" Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.790840 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 
15:36:34.792302 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.792348 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.792376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:34 crc kubenswrapper[4696]: I0318 15:36:34.794104 4696 scope.go:117] "RemoveContainer" containerID="06ac02324b6506e3dc6a89431994943093a2f855e751d6b881835e9c54d885c7"
Mar 18 15:36:34 crc kubenswrapper[4696]: E0318 15:36:34.794397 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:36:35 crc kubenswrapper[4696]: I0318 15:36:35.557159 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:35 crc kubenswrapper[4696]: I0318 15:36:35.794987 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 15:36:36 crc kubenswrapper[4696]: I0318 15:36:36.338882 4696 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 15:36:36 crc kubenswrapper[4696]: I0318 15:36:36.339003 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 15:36:36 crc kubenswrapper[4696]: E0318 15:36:36.343974 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df986f832afb3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 18 15:36:36 crc kubenswrapper[4696]: &Event{ObjectMeta:{kube-controller-manager-crc.189df986f832afb3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 18 15:36:36 crc kubenswrapper[4696]: body:
Mar 18 15:36:36 crc kubenswrapper[4696]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:26.338848691 +0000 UTC m=+29.345022917,LastTimestamp:2026-03-18 15:36:36.338964628 +0000 UTC m=+39.345138874,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 15:36:36 crc kubenswrapper[4696]: >
Mar 18 15:36:36 crc kubenswrapper[4696]: E0318 15:36:36.349884 4696 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df986f8338abb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df986f8338abb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:36:26.338904763 +0000 UTC m=+29.345078989,LastTimestamp:2026-03-18 15:36:36.33906131 +0000 UTC m=+39.345235556,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:36:36 crc kubenswrapper[4696]: I0318 15:36:36.555741 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:36 crc kubenswrapper[4696]: W0318 15:36:36.897694 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:36 crc kubenswrapper[4696]: E0318 15:36:36.897759 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 18 15:36:37 crc kubenswrapper[4696]: I0318 15:36:37.539135 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:36:37 crc kubenswrapper[4696]: I0318 15:36:37.539366 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:37 crc kubenswrapper[4696]: I0318 15:36:37.540934 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:37 crc kubenswrapper[4696]: I0318 15:36:37.540995 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:37 crc kubenswrapper[4696]: I0318 15:36:37.541020 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:37 crc kubenswrapper[4696]: I0318 15:36:37.542002 4696 scope.go:117] "RemoveContainer" containerID="06ac02324b6506e3dc6a89431994943093a2f855e751d6b881835e9c54d885c7"
Mar 18 15:36:37 crc kubenswrapper[4696]: E0318 15:36:37.542366 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:36:37 crc kubenswrapper[4696]: I0318 15:36:37.555703 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:37 crc kubenswrapper[4696]: E0318 15:36:37.668808 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 15:36:38 crc kubenswrapper[4696]: I0318 15:36:38.556011 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:38 crc kubenswrapper[4696]: I0318 15:36:38.736213 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:38 crc kubenswrapper[4696]: I0318 15:36:38.738316 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:38 crc kubenswrapper[4696]: I0318 15:36:38.738459 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:38 crc kubenswrapper[4696]: I0318 15:36:38.738494 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:38 crc kubenswrapper[4696]: I0318 15:36:38.738571 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:36:38 crc kubenswrapper[4696]: E0318 15:36:38.744462 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 15:36:38 crc kubenswrapper[4696]: E0318 15:36:38.744677 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 15:36:39 crc kubenswrapper[4696]: I0318 15:36:39.555047 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:40 crc kubenswrapper[4696]: I0318 15:36:40.554301 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:41 crc kubenswrapper[4696]: I0318 15:36:41.555797 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:42 crc kubenswrapper[4696]: I0318 15:36:42.555288 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.351780 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.351950 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.353259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.353288 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.353335 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.356101 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:36:43 crc kubenswrapper[4696]: W0318 15:36:43.359035 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 18 15:36:43 crc kubenswrapper[4696]: E0318 15:36:43.359076 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.555815 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.818679 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.820134 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.820204 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:43 crc kubenswrapper[4696]: I0318 15:36:43.820225 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:44 crc kubenswrapper[4696]: I0318 15:36:44.036692 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:36:44 crc kubenswrapper[4696]: I0318 15:36:44.037031 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:44 crc kubenswrapper[4696]: I0318 15:36:44.038617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:44 crc kubenswrapper[4696]: I0318 15:36:44.038891 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:44 crc kubenswrapper[4696]: I0318 15:36:44.039045 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:44 crc kubenswrapper[4696]: I0318 15:36:44.040326 4696 scope.go:117] "RemoveContainer" containerID="06ac02324b6506e3dc6a89431994943093a2f855e751d6b881835e9c54d885c7"
Mar 18 15:36:44 crc kubenswrapper[4696]: E0318 15:36:44.040872 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:36:44 crc kubenswrapper[4696]: I0318 15:36:44.555763 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:44 crc kubenswrapper[4696]: W0318 15:36:44.821712 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 18 15:36:44 crc kubenswrapper[4696]: E0318 15:36:44.822011 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 18 15:36:45 crc kubenswrapper[4696]: I0318 15:36:45.555493 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:45 crc kubenswrapper[4696]: I0318 15:36:45.745772 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:45 crc kubenswrapper[4696]: I0318 15:36:45.746914 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:45 crc kubenswrapper[4696]: I0318 15:36:45.746941 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:45 crc kubenswrapper[4696]: I0318 15:36:45.746951 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:45 crc kubenswrapper[4696]: I0318 15:36:45.746972 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:36:45 crc kubenswrapper[4696]: E0318 15:36:45.753421 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 15:36:45 crc kubenswrapper[4696]: E0318 15:36:45.753468 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 15:36:46 crc kubenswrapper[4696]: I0318 15:36:46.531212 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 15:36:46 crc kubenswrapper[4696]: I0318 15:36:46.531400 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:46 crc kubenswrapper[4696]: I0318 15:36:46.532457 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:46 crc kubenswrapper[4696]: I0318 15:36:46.532514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:46 crc kubenswrapper[4696]: I0318 15:36:46.532540 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:46 crc kubenswrapper[4696]: I0318 15:36:46.554722 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:47 crc kubenswrapper[4696]: I0318 15:36:47.554157 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:47 crc kubenswrapper[4696]: E0318 15:36:47.669788 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 15:36:47 crc kubenswrapper[4696]: W0318 15:36:47.696328 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 18 15:36:47 crc kubenswrapper[4696]: E0318 15:36:47.696419 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 18 15:36:48 crc kubenswrapper[4696]: I0318 15:36:48.555318 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:49 crc kubenswrapper[4696]: I0318 15:36:49.554782 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:50 crc kubenswrapper[4696]: I0318 15:36:50.554538 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:51 crc kubenswrapper[4696]: I0318 15:36:51.557266 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:52 crc kubenswrapper[4696]: I0318 15:36:52.555808 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:52 crc kubenswrapper[4696]: I0318 15:36:52.754564 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:52 crc kubenswrapper[4696]: I0318 15:36:52.756030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:52 crc kubenswrapper[4696]: I0318 15:36:52.756063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:52 crc kubenswrapper[4696]: I0318 15:36:52.756074 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:52 crc kubenswrapper[4696]: I0318 15:36:52.756099 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:36:52 crc kubenswrapper[4696]: E0318 15:36:52.761167 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 15:36:52 crc kubenswrapper[4696]: E0318 15:36:52.761598 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 15:36:53 crc kubenswrapper[4696]: I0318 15:36:53.556294 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:54 crc kubenswrapper[4696]: I0318 15:36:54.551669 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:55 crc kubenswrapper[4696]: I0318 15:36:55.556023 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.557891 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.597035 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.598957 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.599035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.599048 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.599966 4696 scope.go:117] "RemoveContainer" containerID="06ac02324b6506e3dc6a89431994943093a2f855e751d6b881835e9c54d885c7"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.858081 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.860369 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe"}
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.860646 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.862920 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.862966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:56 crc kubenswrapper[4696]: I0318 15:36:56.862984 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.557010 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:57 crc kubenswrapper[4696]: E0318 15:36:57.670925 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.867102 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.868034 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.870552 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe"}
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.870568 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe" exitCode=255
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.870625 4696 scope.go:117] "RemoveContainer" containerID="06ac02324b6506e3dc6a89431994943093a2f855e751d6b881835e9c54d885c7"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.870780 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.872785 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.872808 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.872820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:57 crc kubenswrapper[4696]: I0318 15:36:57.873419 4696 scope.go:117] "RemoveContainer" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe"
Mar 18 15:36:57 crc kubenswrapper[4696]: E0318 15:36:57.873658 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:36:58 crc kubenswrapper[4696]: I0318 15:36:58.569895 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:58 crc kubenswrapper[4696]: I0318 15:36:58.877009 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 18 15:36:59 crc kubenswrapper[4696]: I0318 15:36:59.556762 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:36:59 crc kubenswrapper[4696]: I0318 15:36:59.761716 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:36:59 crc kubenswrapper[4696]: I0318 15:36:59.763993 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:59 crc kubenswrapper[4696]: I0318 15:36:59.764063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:59 crc kubenswrapper[4696]: I0318 15:36:59.764088 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:59 crc kubenswrapper[4696]: I0318 15:36:59.764177 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:36:59 crc kubenswrapper[4696]: E0318 15:36:59.768599 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 15:36:59 crc kubenswrapper[4696]: E0318 15:36:59.769564 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 15:37:00 crc kubenswrapper[4696]: I0318 15:37:00.553151 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:37:01 crc kubenswrapper[4696]: I0318 15:37:01.554643 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:37:02 crc kubenswrapper[4696]: I0318 15:37:02.060412 4696 csr.go:261] certificate signing request csr-rrv5g is approved, waiting to be issued
Mar 18 15:37:02 crc kubenswrapper[4696]: I0318 15:37:02.079384 4696 csr.go:257] certificate signing request csr-rrv5g is issued
Mar 18 15:37:02 crc kubenswrapper[4696]: I0318 15:37:02.162513 4696 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 18 15:37:02 crc kubenswrapper[4696]: I0318 15:37:02.419305 4696 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 15:37:03 crc kubenswrapper[4696]: I0318 15:37:03.082580 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-22 14:26:49.275477153 +0000 UTC
Mar 18 15:37:03 crc kubenswrapper[4696]: I0318 15:37:03.083368 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5974h49m46.192115317s for next certificate rotation
Mar 18 15:37:04 crc kubenswrapper[4696]: I0318 15:37:04.035736 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:37:04 crc kubenswrapper[4696]: I0318 15:37:04.036006 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:37:04 crc kubenswrapper[4696]: I0318 15:37:04.037704 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:04 crc kubenswrapper[4696]: I0318 15:37:04.038171 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:04 crc kubenswrapper[4696]: I0318 15:37:04.038576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:04 crc kubenswrapper[4696]: I0318 15:37:04.039635 4696 scope.go:117] "RemoveContainer" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe"
Mar 18 15:37:04 crc kubenswrapper[4696]: E0318 15:37:04.040002 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.770339 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.772172 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.772224 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.772238 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.772386 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.783349 4696 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.784194 4696 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 18 15:37:06 crc kubenswrapper[4696]: E0318 15:37:06.784229 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.788492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.788592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.788613 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.788640 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.788658 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:06Z","lastTransitionTime":"2026-03-18T15:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:37:06 crc kubenswrapper[4696]: E0318 15:37:06.805791 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.814476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.814513 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.814540 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.814557 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.814570 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:06Z","lastTransitionTime":"2026-03-18T15:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:06 crc kubenswrapper[4696]: E0318 15:37:06.830386 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.839156 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.839212 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.839220 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.839239 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.839251 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:06Z","lastTransitionTime":"2026-03-18T15:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:06 crc kubenswrapper[4696]: E0318 15:37:06.852692 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.861279 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.861348 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.861358 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.861382 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:06 crc kubenswrapper[4696]: I0318 15:37:06.861394 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:06Z","lastTransitionTime":"2026-03-18T15:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:06 crc kubenswrapper[4696]: E0318 15:37:06.873038 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:06 crc kubenswrapper[4696]: E0318 15:37:06.873175 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:37:06 crc kubenswrapper[4696]: E0318 15:37:06.873201 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:06 crc kubenswrapper[4696]: E0318 15:37:06.973479 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.074680 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.175448 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.276586 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.377331 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.478090 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: I0318 15:37:07.539590 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:07 crc kubenswrapper[4696]: I0318 15:37:07.539782 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:07 crc kubenswrapper[4696]: I0318 
15:37:07.541144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:07 crc kubenswrapper[4696]: I0318 15:37:07.541198 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:07 crc kubenswrapper[4696]: I0318 15:37:07.541210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:07 crc kubenswrapper[4696]: I0318 15:37:07.542054 4696 scope.go:117] "RemoveContainer" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.542290 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.579220 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.673705 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.679461 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.779642 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.880101 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 
15:37:07 crc kubenswrapper[4696]: E0318 15:37:07.980417 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:08 crc kubenswrapper[4696]: E0318 15:37:08.081670 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:08 crc kubenswrapper[4696]: E0318 15:37:08.182632 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:08 crc kubenswrapper[4696]: E0318 15:37:08.283492 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:08 crc kubenswrapper[4696]: E0318 15:37:08.384503 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:08 crc kubenswrapper[4696]: E0318 15:37:08.485575 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:08 crc kubenswrapper[4696]: E0318 15:37:08.586315 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:08 crc kubenswrapper[4696]: E0318 15:37:08.884420 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:08 crc kubenswrapper[4696]: E0318 15:37:08.984713 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.085766 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.186546 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.286920 4696 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.387849 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.488269 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.589443 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: I0318 15:37:09.596964 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:09 crc kubenswrapper[4696]: I0318 15:37:09.599013 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:09 crc kubenswrapper[4696]: I0318 15:37:09.599070 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:09 crc kubenswrapper[4696]: I0318 15:37:09.599085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.689730 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.790933 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.891369 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:09 crc kubenswrapper[4696]: E0318 15:37:09.992606 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.092849 4696 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.194018 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.295162 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.396268 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.496835 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.597049 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.698275 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.798957 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:10 crc kubenswrapper[4696]: E0318 15:37:10.899879 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.000949 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.101953 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.202636 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc 
kubenswrapper[4696]: E0318 15:37:11.303000 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.403471 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.504821 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.605353 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.705882 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.806323 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:11 crc kubenswrapper[4696]: E0318 15:37:11.906778 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.007416 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: I0318 15:37:12.048602 4696 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.108004 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.209238 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.309922 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.410628 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.511722 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: I0318 15:37:12.611001 4696 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.611974 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.712515 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.813296 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:12 crc kubenswrapper[4696]: E0318 15:37:12.914324 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.015313 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.116406 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.217358 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.318313 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.418775 4696 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.519208 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.620037 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.720614 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.821644 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:13 crc kubenswrapper[4696]: E0318 15:37:13.922164 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.023382 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.124125 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.225165 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.326260 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.427211 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.528445 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.629227 4696 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.730378 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.831007 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:14 crc kubenswrapper[4696]: E0318 15:37:14.932016 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.033289 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.134007 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.235095 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.335970 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.436107 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.536784 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.637711 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.738690 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc 
kubenswrapper[4696]: E0318 15:37:15.839741 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:15 crc kubenswrapper[4696]: E0318 15:37:15.940775 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.041144 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.106201 4696 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.143793 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.143836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.143847 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.143871 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.143886 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.246755 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.247332 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.247506 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.247702 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.248165 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.351544 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.351869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.351952 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.352053 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.352145 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.455770 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.456141 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.456249 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.456393 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.456487 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.560430 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.560509 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.560568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.560598 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.560617 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.663789 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.663867 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.663881 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.663904 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.663918 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.767228 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.767285 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.767298 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.767319 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.767333 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.871793 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.872259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.872409 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.872579 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.872754 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.889331 4696 apiserver.go:52] "Watching apiserver" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.894397 4696 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.895083 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.895875 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.896621 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.896832 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.896877 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.896888 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.897250 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.897619 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.897942 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.898192 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.899806 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.899872 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.900030 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.900162 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.900318 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.900567 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.900693 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.901301 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.903600 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.910464 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc 
kubenswrapper[4696]: I0318 15:37:16.910559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.910585 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.910635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.910662 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.924491 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.931324 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.931388 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.931407 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.931433 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.931453 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.933980 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.943940 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},
{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.949434 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.949560 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.949583 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.949613 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.949639 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.954497 4696 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.954851 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.963298 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},
{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.967143 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.967644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.967669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.967694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.967715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.967731 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.977915 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.978497 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.981652 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.981704 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.981718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.981741 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.981757 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.988920 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.991864 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:16 crc kubenswrapper[4696]: E0318 15:37:16.992028 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.993770 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.993815 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.993828 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.993847 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:16 crc kubenswrapper[4696]: I0318 15:37:16.993862 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:16Z","lastTransitionTime":"2026-03-18T15:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.000063 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.013744 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.031045 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042737 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042799 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042829 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042857 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042882 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042907 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042929 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042957 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.042979 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043001 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043027 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043051 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043078 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 
15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043101 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043126 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043148 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043172 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043198 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043224 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043251 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043278 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043304 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043327 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043353 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043598 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043627 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043624 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043651 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043764 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043814 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043856 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043898 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043939 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.043979 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044016 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044054 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044102 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044147 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044187 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044298 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044384 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044694 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044752 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044806 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044861 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044933 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045008 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.044712 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045057 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045133 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045250 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045319 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045343 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045432 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045422 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045463 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045698 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045831 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045853 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.046240 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.046330 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.046382 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.046556 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.045470 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.046855 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.046943 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.046993 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047039 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047081 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047119 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047155 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047198 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047241 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047278 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047315 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047362 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047405 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047446 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047485 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047551 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047592 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047631 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047668 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047703 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047741 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047775 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047813 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047858 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047897 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047936 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047931 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047976 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.047998 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048017 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048054 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048087 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048119 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048155 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048202 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048239 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048270 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048314 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048347 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048355 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048387 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048422 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048461 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048497 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048576 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048613 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048647 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048677 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048712 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048739 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048769 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048799 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048829 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048861 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048892 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048923 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048954 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.048984 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049022 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049061 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049094 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049123 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049130 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049156 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049191 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049223 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049229 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049257 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049375 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049401 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049552 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049626 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049824 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049895 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.049964 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050595 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050670 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050731 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050787 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050814 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050843 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050855 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050874 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050931 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.050988 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051028 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051203 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051322 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051387 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051432 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051471 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051537 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051578 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051615 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051655 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051694 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051732 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051774 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051812 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051850 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051889 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051927 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051962 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051998 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052032 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052070 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052105 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052146 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052182 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052220 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052257 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052292 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052326 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052363 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052402 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052436 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052496 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052564 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052611 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052645 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052682 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052717 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052749 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052782 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052814 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052852 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052885 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052922 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052957 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051238 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051566 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.053052 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051563 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052940 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.051844 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052031 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052087 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052153 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052215 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052387 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052446 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.053297 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.053813 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052689 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.052906 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.053853 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.053027 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:17.552975142 +0000 UTC m=+80.559149518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054022 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054106 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054142 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.053785 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054223 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054293 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054367 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054411 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054453 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054495 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054590 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054646 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054697 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054735 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054774 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054816 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054869 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054883 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054928 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054963 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.054990 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055046 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055092 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055133 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055179 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055199 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055237 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055302 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055355 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055409 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055425 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055473 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055613 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055692 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055755 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055809 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055868 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.055932 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056000 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056040 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056089 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056142 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056200 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056265 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056333 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056385 4696 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056398 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056559 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056367 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056581 4696 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056601 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056616 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056629 4696 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056645 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056658 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056669 4696 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056752 
4696 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056766 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056779 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056790 4696 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056808 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056821 4696 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056832 4696 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056844 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056855 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056865 4696 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056875 4696 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056888 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056898 4696 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056909 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.056920 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 
crc kubenswrapper[4696]: I0318 15:37:17.056923 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057022 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057040 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057051 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057061 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057072 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057083 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057342 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057388 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057426 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057434 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057463 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057566 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057574 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057669 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057890 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.058010 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057887 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.053645 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.057857 4696 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.053281 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.060159 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.060213 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.060313 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.060347 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.060186 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.060589 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.060582 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.061028 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.061096 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.061306 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.061368 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.061600 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.062400 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.062550 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.062632 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.062673 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.062794 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063024 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063020 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063076 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063166 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063290 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063592 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063624 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063661 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063686 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063770 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063825 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063858 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.063933 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.064133 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.064338 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.064422 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.064742 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.065328 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.065733 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.065831 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.065887 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.065951 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.066253 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.066618 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.066631 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.066838 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.065347 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.065357 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.065418 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.066953 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.066971 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.067220 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.067295 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.067375 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.067452 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.067557 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.067823 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.067860 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.067879 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.068158 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.068203 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.069006 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.068970 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.069029 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.069074 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.069045 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.068455 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.069302 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.069369 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.069409 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.069617 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.069878 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.070047 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.070613 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.070010 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.070853 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.070860 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.071194 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.071388 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:17.571368099 +0000 UTC m=+80.577542305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.071547 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.073107 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.073106 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.073441 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.073685 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.073787 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.072729 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:17.571747357 +0000 UTC m=+80.577921603 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.074352 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.074450 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.074609 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.075998 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.076084 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.084128 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.084176 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.084208 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.084374 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:17.584336896 +0000 UTC m=+80.590511312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.085141 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.085180 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.085191 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.085903 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.086939 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.089107 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.089641 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.090113 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.091111 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.091320 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: 
"09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.092019 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.092146 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.092828 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.093247 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.093269 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.093356 4696 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:17.593325455 +0000 UTC m=+80.599499691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.096599 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.099406 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.099552 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.099808 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.099845 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.100259 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.100328 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.100934 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.101014 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.101067 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.101607 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.101647 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.101658 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.102008 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.101677 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.102093 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.102554 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.103015 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.103366 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.103480 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.103607 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.103801 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.104069 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.104456 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.104343 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.105581 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.105670 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.106115 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.106135 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.106284 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.107038 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.107563 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.108117 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.108731 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.108790 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.108874 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.109027 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.109244 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.109263 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.109513 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.109849 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.110086 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.110148 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.112574 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.118311 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.121670 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.138825 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.144114 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.158350 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.158621 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.158477 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.158690 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.158908 4696 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.158975 4696 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159035 4696 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159100 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159159 4696 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159220 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159281 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159334 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159391 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159452 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159511 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159597 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159658 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159822 4696 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159898 4696 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.159963 4696 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160021 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160156 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160217 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160278 4696 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160334 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160390 4696 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160448 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node 
\"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160514 4696 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160614 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160674 4696 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160726 4696 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160788 4696 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160849 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.160906 4696 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc 
kubenswrapper[4696]: I0318 15:37:17.160963 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161020 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161082 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161145 4696 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161308 4696 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161366 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161427 4696 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161486 4696 reconciler_common.go:293] "Volume 
detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161573 4696 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161648 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161710 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161766 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161831 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161889 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.161947 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162007 4696 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162070 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162133 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162190 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162248 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162304 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162363 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" 
DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162422 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162481 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162565 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162640 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162701 4696 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162761 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162819 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc 
kubenswrapper[4696]: I0318 15:37:17.162876 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162934 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.162991 4696 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163055 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163114 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163172 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163235 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163316 4696 reconciler_common.go:293] 
"Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163500 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163597 4696 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163671 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163731 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163786 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163852 4696 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163913 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.163971 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164028 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164085 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164148 4696 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164206 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164264 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164320 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on 
node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164376 4696 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164433 4696 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164490 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164575 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164643 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164700 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164753 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164805 4696 reconciler_common.go:293] "Volume 
detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164862 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164922 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.164981 4696 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165038 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165096 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165151 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165210 4696 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165270 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165329 4696 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165387 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165444 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165501 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165599 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165667 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" 
DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165724 4696 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165783 4696 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165847 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.165905 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166026 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166085 4696 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166141 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166199 4696 
reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166256 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166358 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166628 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166692 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166749 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166806 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166863 4696 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166923 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.166982 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167037 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167094 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167150 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167210 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167277 4696 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167335 4696 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167391 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167448 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167505 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167582 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167661 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167721 4696 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" 
Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167781 4696 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167838 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167896 4696 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.167957 4696 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168024 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168082 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168135 4696 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168194 4696 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168253 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168313 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168370 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168430 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168488 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168573 4696 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168638 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168698 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168756 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168816 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168872 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.168930 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169065 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169120 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" 
DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169172 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169228 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169284 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169337 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169401 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169457 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169513 4696 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169594 4696 reconciler_common.go:293] "Volume 
detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169652 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169704 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.169761 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.206735 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.206775 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.206786 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.206804 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.206815 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.214973 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.224083 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.234022 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:37:17 crc kubenswrapper[4696]: W0318 15:37:17.246050 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-323aece5ca1919d72b31747470a05882492fb300deb1c1f4d19042c0425ed9cb WatchSource:0}: Error finding container 323aece5ca1919d72b31747470a05882492fb300deb1c1f4d19042c0425ed9cb: Status 404 returned error can't find the container with id 323aece5ca1919d72b31747470a05882492fb300deb1c1f4d19042c0425ed9cb Mar 18 15:37:17 crc kubenswrapper[4696]: W0318 15:37:17.248282 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-8e689925a8a13c2bfc18555b8f5bd159582c7643b13c947083b0aa218c519966 WatchSource:0}: Error finding container 8e689925a8a13c2bfc18555b8f5bd159582c7643b13c947083b0aa218c519966: Status 404 returned error can't find the container with id 8e689925a8a13c2bfc18555b8f5bd159582c7643b13c947083b0aa218c519966 Mar 18 15:37:17 crc kubenswrapper[4696]: W0318 15:37:17.254736 4696 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ae72e228550f7622a6ea12e58ba5814230da01d242a1d5b27dc408c61064f07e WatchSource:0}: Error finding container ae72e228550f7622a6ea12e58ba5814230da01d242a1d5b27dc408c61064f07e: Status 404 returned error can't find the container with id ae72e228550f7622a6ea12e58ba5814230da01d242a1d5b27dc408c61064f07e Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.326314 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.326354 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.326363 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.326378 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.326387 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.432312 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.432377 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.432399 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.432426 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.432446 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.536514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.536579 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.536592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.536616 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.536631 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.580791 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.580900 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.580925 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.581039 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.581090 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:18.581077379 +0000 UTC m=+81.587251585 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.581452 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.581592 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:18.58155477 +0000 UTC m=+81.587729016 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.581748 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:18.581734844 +0000 UTC m=+81.587909050 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.603026 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.603755 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.604844 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.605456 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.606388 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.606940 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.607503 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.609182 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.609229 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.610012 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.611237 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.611980 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.613043 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.613556 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.614040 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.614935 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.615417 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.616348 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.616746 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.617265 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.618217 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.618651 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.619554 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.619965 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.621001 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.621390 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.621566 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.621995 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.623888 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.624783 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.625741 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.626806 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.627707 4696 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.627906 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.631178 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.632672 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.634109 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.635316 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.636041 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.637182 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.638053 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.639055 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.640154 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.640869 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.641912 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.641957 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.641973 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.641995 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.642008 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.644409 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.645380 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.645390 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.646780 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.647448 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.648851 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.649656 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 15:37:17 crc 
kubenswrapper[4696]: I0318 15:37:17.651210 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.652617 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.654075 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.656471 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.657893 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.657937 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.659831 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.662492 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.669318 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.681314 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.681369 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.681537 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.681575 
4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.681592 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.681538 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.681668 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:18.681640335 +0000 UTC m=+81.687814541 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.681673 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.681696 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:17 crc kubenswrapper[4696]: E0318 15:37:17.681748 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:18.681731077 +0000 UTC m=+81.687905303 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.744881 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.744931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.744940 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.744957 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.744971 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.847615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.847675 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.847729 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.847760 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.847776 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.940181 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.940234 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.940247 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ae72e228550f7622a6ea12e58ba5814230da01d242a1d5b27dc408c61064f07e"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.941714 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8e689925a8a13c2bfc18555b8f5bd159582c7643b13c947083b0aa218c519966"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.943080 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.943115 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"323aece5ca1919d72b31747470a05882492fb300deb1c1f4d19042c0425ed9cb"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.950741 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.950818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.950834 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.950855 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.950870 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:17Z","lastTransitionTime":"2026-03-18T15:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.965454 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:17 crc kubenswrapper[4696]: I0318 15:37:17.990674 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.009407 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.027304 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.045483 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.053464 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.053648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.053687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.053710 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.053973 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.063275 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.082639 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.099155 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.117270 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.135631 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.151881 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.157555 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.157625 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.157645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.157679 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.157698 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.172004 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.261898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.261949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.261958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.261976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.261987 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.365918 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.365977 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.366000 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.366032 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.366056 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.470600 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.470688 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.470710 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.470740 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.470763 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.574467 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.574548 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.574565 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.574590 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.574610 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.591670 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.591764 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.591803 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.591988 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.592064 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:20.592041172 +0000 UTC m=+83.598215418 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.592577 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:20.592558573 +0000 UTC m=+83.598732809 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.592649 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.592697 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:20.592681456 +0000 UTC m=+83.598855692 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.597119 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.597282 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.597376 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.597461 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.597576 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.597727 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.677486 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.677571 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.677586 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.677607 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.677621 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.693204 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.693271 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.693455 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.693478 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.693494 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.693583 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:20.693561449 +0000 UTC m=+83.699735665 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.693665 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.693715 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.693741 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:18 crc kubenswrapper[4696]: E0318 15:37:18.693847 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:20.693809194 +0000 UTC m=+83.699983570 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.781403 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.781483 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.781507 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.781585 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.781610 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.885172 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.885250 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.885268 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.885293 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.885310 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.988383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.988454 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.988468 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.988492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:18 crc kubenswrapper[4696]: I0318 15:37:18.988507 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:18Z","lastTransitionTime":"2026-03-18T15:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.091702 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.091749 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.091758 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.091777 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.091787 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.199286 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.199336 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.199348 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.199365 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.199382 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.301673 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.301775 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.301802 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.301839 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.301863 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.403854 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.403903 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.403912 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.403927 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.403938 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.506511 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.506593 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.506607 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.506629 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.506646 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.609126 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.609739 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.609942 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.610142 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.610381 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.616734 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.617838 4696 scope.go:117] "RemoveContainer" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe" Mar 18 15:37:19 crc kubenswrapper[4696]: E0318 15:37:19.618423 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.713134 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.713488 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.713767 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.713877 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.713961 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.816785 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.816894 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.816911 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.816933 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.816947 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.921201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.921278 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.921297 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.921331 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.921351 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:19Z","lastTransitionTime":"2026-03-18T15:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:19 crc kubenswrapper[4696]: I0318 15:37:19.950260 4696 scope.go:117] "RemoveContainer" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe" Mar 18 15:37:19 crc kubenswrapper[4696]: E0318 15:37:19.950567 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.024694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.024747 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.024758 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.024779 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.024791 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.127803 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.127884 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.127902 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.127958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.127982 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.231298 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.231379 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.231399 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.231428 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.231450 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.334509 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.334592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.334605 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.334629 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.334645 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.438384 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.438450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.438460 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.438478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.438492 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.541638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.541682 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.541701 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.541720 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.541732 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.596818 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.596947 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.597016 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.597117 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.597020 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.597268 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.612786 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.612916 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.612951 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.613106 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.613183 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:24.613155009 +0000 UTC m=+87.619329225 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.613267 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:24.613257781 +0000 UTC m=+87.619431987 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.613311 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.613337 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:24.613330452 +0000 UTC m=+87.619504658 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.645380 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.645764 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.645960 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.646231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.646424 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.714373 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.714791 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.714858 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.714881 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.714975 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:24.714945661 +0000 UTC m=+87.721120047 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.715167 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.715209 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.715230 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:37:20 crc kubenswrapper[4696]: E0318 15:37:20.715334 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:24.715298189 +0000 UTC m=+87.721472545 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.714829 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.750346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.750397 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.750413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.750433 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.750447 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.854327 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.854680 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.854760 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.854836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.854909 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.956920 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.957291 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.957395 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.957572 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:37:20 crc kubenswrapper[4696]: I0318 15:37:20.957656 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:20Z","lastTransitionTime":"2026-03-18T15:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 15:37:21 crc kubenswrapper[4696]: I0318 15:37:21.957653 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d"}
Mar 18 15:37:21 crc kubenswrapper[4696]: I0318 15:37:21.972436 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:21Z is after 2025-08-24T17:21:41Z"
Mar 18 15:37:21 crc kubenswrapper[4696]: I0318 15:37:21.989008 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:37:21 crc kubenswrapper[4696]: I0318 15:37:21.989059 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:37:21 crc kubenswrapper[4696]: I0318 15:37:21.989076 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:37:21 crc kubenswrapper[4696]: I0318 15:37:21.989102 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:37:21 crc kubenswrapper[4696]: I0318 15:37:21.989121 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:21Z","lastTransitionTime":"2026-03-18T15:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:37:21 crc kubenswrapper[4696]: I0318 15:37:21.996935 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:21Z is after 2025-08-24T17:21:41Z"
Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.016167 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:22Z is after 2025-08-24T17:21:41Z"
Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.038131 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:22Z is after 2025-08-24T17:21:41Z"
Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.053398 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.070366 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.089065 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.092669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.092715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.092730 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.092750 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.092767 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.196857 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.196953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.196983 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.197021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.197054 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.300439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.300501 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.300537 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.300563 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.300576 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.412509 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.412589 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.412600 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.412617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.412629 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.515447 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.515576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.515608 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.515645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.515674 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.597499 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.597567 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.597751 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:22 crc kubenswrapper[4696]: E0318 15:37:22.597872 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:22 crc kubenswrapper[4696]: E0318 15:37:22.598345 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:22 crc kubenswrapper[4696]: E0318 15:37:22.598404 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.618835 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.618886 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.618895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.618915 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.618926 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.722226 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.722280 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.722298 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.722325 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.722343 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.825514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.825623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.825648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.825691 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.825733 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.928624 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.928711 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.928725 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.928770 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:22 crc kubenswrapper[4696]: I0318 15:37:22.928787 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:22Z","lastTransitionTime":"2026-03-18T15:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:23 crc kubenswrapper[4696]: I0318 15:37:23.031725 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:23 crc kubenswrapper[4696]: I0318 15:37:23.031872 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:23 crc kubenswrapper[4696]: I0318 15:37:23.031898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:23 crc kubenswrapper[4696]: I0318 15:37:23.032006 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:23 crc kubenswrapper[4696]: I0318 15:37:23.032069 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:23Z","lastTransitionTime":"2026-03-18T15:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.584266 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.584359 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.584379 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.584411 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.584432 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:24Z","lastTransitionTime":"2026-03-18T15:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.596598 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.596667 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.596741 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.596819 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.596974 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.597225 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.652671 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.652763 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.652791 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.652889 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.652906 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:37:32.652852608 +0000 UTC m=+95.659026834 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.652954 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:32.65294345 +0000 UTC m=+95.659117666 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.652985 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.653108 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:32.653080523 +0000 UTC m=+95.659254919 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.687625 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.687679 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.687693 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.687716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.687732 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:24Z","lastTransitionTime":"2026-03-18T15:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.753360 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.753434 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.753585 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.753603 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.753615 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.753609 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:24 crc 
kubenswrapper[4696]: E0318 15:37:24.753644 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.753656 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.753672 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:32.753656318 +0000 UTC m=+95.759830524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:24 crc kubenswrapper[4696]: E0318 15:37:24.753716 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:32.753694739 +0000 UTC m=+95.759868945 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.789906 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.789951 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.789961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.789975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.789984 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:24Z","lastTransitionTime":"2026-03-18T15:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.893121 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.893173 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.893182 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.893196 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:24 crc kubenswrapper[4696]: I0318 15:37:24.893204 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:24Z","lastTransitionTime":"2026-03-18T15:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.409335 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.409399 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.409412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.409431 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.409445 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:25Z","lastTransitionTime":"2026-03-18T15:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.512382 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.512449 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.512466 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.512489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.512503 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:25Z","lastTransitionTime":"2026-03-18T15:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.614949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.615006 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.615017 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.615036 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.615047 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:25Z","lastTransitionTime":"2026-03-18T15:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.717020 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.717064 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.717078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.717095 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.717107 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:25Z","lastTransitionTime":"2026-03-18T15:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.825834 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.825872 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.825880 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.825895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.825904 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:25Z","lastTransitionTime":"2026-03-18T15:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.928605 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.928698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.928720 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.928753 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:25 crc kubenswrapper[4696]: I0318 15:37:25.928774 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:25Z","lastTransitionTime":"2026-03-18T15:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.031909 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.031954 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.031965 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.031985 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.032002 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.135134 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.135208 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.135227 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.135254 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.135274 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.238652 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.238715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.238730 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.238752 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.238766 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.344290 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.344367 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.344388 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.344413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.344431 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.448262 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.448328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.448347 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.448374 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.448392 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.552292 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.552349 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.552359 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.552378 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.552389 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.597123 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.597123 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:26 crc kubenswrapper[4696]: E0318 15:37:26.597279 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:26 crc kubenswrapper[4696]: E0318 15:37:26.597326 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.597151 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:26 crc kubenswrapper[4696]: E0318 15:37:26.597405 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.655587 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.655627 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.655636 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.655656 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.655669 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.758446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.758483 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.758492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.758508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.758541 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.861339 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.861389 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.861403 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.861428 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.861442 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.968626 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.968677 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.968689 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.968710 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:26 crc kubenswrapper[4696]: I0318 15:37:26.968721 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:26Z","lastTransitionTime":"2026-03-18T15:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.071502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.071549 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.071562 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.071580 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.071592 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.174560 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.174621 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.174634 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.174662 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.174677 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.277917 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.277983 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.277996 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.278018 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.278031 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.353007 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.353066 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.353078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.353098 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.353120 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: E0318 15:37:27.383893 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.390414 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.390495 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.390507 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.390538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.390549 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: E0318 15:37:27.409853 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.414409 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.414469 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.414483 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.414505 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.414533 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: E0318 15:37:27.433115 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.438405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.438440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.438452 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.438469 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.438479 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: E0318 15:37:27.460976 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.466227 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.466311 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.466336 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.466367 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.466391 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: E0318 15:37:27.482560 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: E0318 15:37:27.482912 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.485848 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.485922 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.485943 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.485978 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.486001 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.589280 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.589361 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.589380 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.589410 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.589429 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.623642 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.648799 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.666150 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.685098 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.692895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.692966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.692987 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.693016 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.693035 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.706874 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.723135 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.739614 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.795911 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.795990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.796008 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.796037 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.796055 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.899682 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.899732 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.899742 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.899757 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:27 crc kubenswrapper[4696]: I0318 15:37:27.899768 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:27Z","lastTransitionTime":"2026-03-18T15:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.002359 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.002417 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.002427 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.002447 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.002463 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.104768 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.104800 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.104809 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.104822 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.104833 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.208045 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.208094 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.208108 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.208128 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.208143 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.310701 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.310929 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.310941 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.310956 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.310965 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.413606 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.413656 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.413669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.413688 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.413702 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.515412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.515456 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.515464 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.515479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.515496 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.597414 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.597494 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.597450 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:28 crc kubenswrapper[4696]: E0318 15:37:28.597597 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:28 crc kubenswrapper[4696]: E0318 15:37:28.597676 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:28 crc kubenswrapper[4696]: E0318 15:37:28.597801 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.617938 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.617968 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.617976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.617989 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.617998 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.721045 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.721104 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.721138 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.721163 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.721179 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.824591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.824644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.824654 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.824672 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.824681 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.927044 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.927079 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.927086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.927115 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:28 crc kubenswrapper[4696]: I0318 15:37:28.927124 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:28Z","lastTransitionTime":"2026-03-18T15:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.030390 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.030441 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.030451 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.030469 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.030482 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.133771 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.134009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.134018 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.134035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.134045 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.236885 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.236941 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.236958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.236977 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.236989 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.339678 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.339737 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.339754 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.339773 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.339785 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.443549 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.443640 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.443664 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.443699 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.443722 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.545634 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.545684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.545696 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.545715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.545730 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.647966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.648014 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.648028 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.648048 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.648061 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.751037 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.751075 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.751086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.751103 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.751116 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.854088 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.854136 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.854148 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.854165 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.854178 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.957344 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.957387 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.957396 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.957413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:29 crc kubenswrapper[4696]: I0318 15:37:29.957423 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:29Z","lastTransitionTime":"2026-03-18T15:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.059698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.059736 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.059762 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.059780 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.059793 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.162220 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.162287 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.162301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.162319 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.162341 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.265850 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.265888 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.265900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.265916 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.265926 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.369314 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.369385 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.369395 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.369412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.369423 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.472668 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.473741 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.473784 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.473817 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.473857 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.576472 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.576594 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.576611 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.576634 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.576649 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.596738 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.596818 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:30 crc kubenswrapper[4696]: E0318 15:37:30.596894 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.597145 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:30 crc kubenswrapper[4696]: E0318 15:37:30.597354 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:30 crc kubenswrapper[4696]: E0318 15:37:30.598548 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.612214 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.679269 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.679579 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.679667 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.679747 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.679813 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.782776 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.783054 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.783115 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.783209 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.783277 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.885458 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.885567 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.885591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.885621 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.885642 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.988078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.988183 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.988199 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.988220 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:30 crc kubenswrapper[4696]: I0318 15:37:30.988235 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:30Z","lastTransitionTime":"2026-03-18T15:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.090958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.091005 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.091016 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.091035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.091047 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.193808 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.193884 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.193896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.193914 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.193924 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.296586 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.296632 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.296643 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.296661 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.296674 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.400717 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.400881 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.400896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.400916 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.400930 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.504026 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.504087 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.504100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.504120 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.504136 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.607082 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.607169 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.607185 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.607201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.607234 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.711174 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.711282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.711307 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.711351 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.711373 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.813885 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.813920 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.813930 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.813944 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.813954 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.916489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.916590 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.916602 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.916622 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:31 crc kubenswrapper[4696]: I0318 15:37:31.916636 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:31Z","lastTransitionTime":"2026-03-18T15:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.019347 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.019397 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.019409 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.019426 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.019438 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.121945 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.121977 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.121990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.122006 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.122015 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.224898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.224953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.224968 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.224991 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.225006 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.327706 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.327747 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.327758 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.327777 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.327789 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.430538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.430606 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.430622 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.430655 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.430673 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.533400 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.533458 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.533470 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.533490 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.533504 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.597222 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.597277 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.597225 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.597434 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.597896 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.597989 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.635985 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.636064 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.636089 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.636124 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.636149 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.728763 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.728914 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.728946 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:48.728916821 +0000 UTC m=+111.735091027 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.728976 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.729018 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.729069 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:48.729058135 +0000 UTC m=+111.735232341 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.729109 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.729150 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:48.729141966 +0000 UTC m=+111.735316172 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.739394 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.739465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.739476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.739509 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.739535 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.830076 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.830390 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.830309 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.830721 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.830814 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.830588 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.830914 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.830928 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.830995 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:48.83097671 +0000 UTC m=+111.837150916 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:32 crc kubenswrapper[4696]: E0318 15:37:32.831114 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:48.831099033 +0000 UTC m=+111.837273239 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.841454 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.841590 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.841689 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.841778 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.841862 4696 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.944265 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.944581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.944678 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.944785 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:32 crc kubenswrapper[4696]: I0318 15:37:32.944872 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:32Z","lastTransitionTime":"2026-03-18T15:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.048024 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.048085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.048103 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.048122 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.048134 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.150515 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.150566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.150575 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.150591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.150602 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.252560 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.252617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.252635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.252662 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.252679 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.356171 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.356479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.356610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.356721 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.356812 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.459699 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.459749 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.459761 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.459781 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.459797 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.562405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.562479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.562498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.562550 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.562572 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.665644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.665717 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.665727 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.665742 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.665754 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.768437 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.768482 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.768492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.768506 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.768515 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.871104 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.871162 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.871176 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.871196 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.871212 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.974663 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.974763 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.974784 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.974818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:33 crc kubenswrapper[4696]: I0318 15:37:33.974845 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:33Z","lastTransitionTime":"2026-03-18T15:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.078819 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.078892 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.078907 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.078931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.078951 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.181623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.181715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.181737 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.181772 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.181793 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.284107 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.284181 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.284201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.284217 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.284227 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.386765 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.386818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.386829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.386848 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.386861 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.489305 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.489355 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.489391 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.489410 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.489426 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.591704 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.592162 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.592270 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.592363 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.592431 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.596982 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.596982 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.596982 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:34 crc kubenswrapper[4696]: E0318 15:37:34.597304 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:34 crc kubenswrapper[4696]: E0318 15:37:34.597427 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.597507 4696 scope.go:117] "RemoveContainer" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe" Mar 18 15:37:34 crc kubenswrapper[4696]: E0318 15:37:34.597696 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:34 crc kubenswrapper[4696]: E0318 15:37:34.597712 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.695784 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.695847 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.695861 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.695883 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.695898 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.799630 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.799694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.799706 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.799724 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.799735 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.902544 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.902599 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.902610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.902629 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:34 crc kubenswrapper[4696]: I0318 15:37:34.902641 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:34Z","lastTransitionTime":"2026-03-18T15:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.005742 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.005793 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.005805 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.005827 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.005841 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.109354 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.109407 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.109419 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.109436 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.109451 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.212777 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.212827 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.212837 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.212856 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.212866 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.310818 4696 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.315549 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.315611 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.315624 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.315644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.315659 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.418404 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.418465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.418476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.418514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.418566 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.522623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.522672 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.522682 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.522698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.522709 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.625803 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.625872 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.625896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.625928 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.625950 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.729101 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.729511 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.729623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.729721 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.729816 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.833423 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.833478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.833489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.833508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.833549 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.938484 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.938551 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.938563 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.938581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:35 crc kubenswrapper[4696]: I0318 15:37:35.938590 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:35Z","lastTransitionTime":"2026-03-18T15:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.041465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.041874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.041994 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.042089 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.042171 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.145057 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.145310 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.145402 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.145505 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.145640 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.248387 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.248440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.248461 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.248492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.248554 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.351820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.351892 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.351915 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.352215 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.352249 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.455875 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.456338 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.456381 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.456417 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.456455 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.560807 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.560898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.560923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.560956 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.560981 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.596430 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.596562 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:36 crc kubenswrapper[4696]: E0318 15:37:36.596621 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.596453 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:36 crc kubenswrapper[4696]: E0318 15:37:36.596729 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:36 crc kubenswrapper[4696]: E0318 15:37:36.596877 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.663809 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.663881 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.663896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.663913 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.663926 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.767034 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.767090 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.767102 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.767121 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.767134 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.869944 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.870021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.870039 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.870068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.870090 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.973686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.973802 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.973813 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.973841 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:36 crc kubenswrapper[4696]: I0318 15:37:36.973862 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:36Z","lastTransitionTime":"2026-03-18T15:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.076896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.076964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.076977 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.077002 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.077015 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.178759 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.178838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.178868 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.178907 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.178935 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.281714 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.281797 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.281820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.281850 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.281874 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.384726 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.384780 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.384789 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.384805 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.384816 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.487844 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.487927 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.487946 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.487975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.487998 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.559009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.559043 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.559051 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.559065 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.559074 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: E0318 15:37:37.577592 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.582356 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.582482 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.582589 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.582684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.582780 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: E0318 15:37:37.594032 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.602000 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.602051 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.602061 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.602081 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.602092 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: E0318 15:37:37.627099 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.634735 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.637311 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.637385 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.637399 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.637422 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.637438 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.662900 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: E0318 15:37:37.678279 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.683020 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.683223 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.683315 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.683385 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.683442 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.692340 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: E0318 15:37:37.701029 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: E0318 15:37:37.702961 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.705574 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.705617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.705627 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.705644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.705655 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.723396 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.739628 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.755733 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.770446 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.782157 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.809065 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.809118 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.809127 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.809144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.809156 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.912676 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.912764 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.912777 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.912801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:37 crc kubenswrapper[4696]: I0318 15:37:37.912818 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:37Z","lastTransitionTime":"2026-03-18T15:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.015021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.015597 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.015839 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.016117 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.016363 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.120276 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.120340 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.120358 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.120383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.120404 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.224030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.224108 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.224127 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.224161 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.224185 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.328594 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.329021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.329115 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.329217 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.329298 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.434089 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.434581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.434752 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.434859 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.434963 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.538396 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.538470 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.538491 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.538547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.538567 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.597213 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.597213 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:38 crc kubenswrapper[4696]: E0318 15:37:38.597389 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:38 crc kubenswrapper[4696]: E0318 15:37:38.597499 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.597247 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:38 crc kubenswrapper[4696]: E0318 15:37:38.597680 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.642836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.642874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.642885 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.642901 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.642913 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.747277 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.747349 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.747368 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.747399 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.747420 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.850383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.850887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.851094 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.851338 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.851572 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.955122 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.955615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.955818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.956062 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:38 crc kubenswrapper[4696]: I0318 15:37:38.956268 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:38Z","lastTransitionTime":"2026-03-18T15:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.059577 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.059628 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.059638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.059657 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.059668 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.162863 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.162955 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.162970 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.162992 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.163008 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.266946 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.267005 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.267022 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.267045 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.267059 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.370365 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.370430 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.370441 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.370460 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.370475 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.474443 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.474497 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.474508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.474548 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.474560 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.577129 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.577175 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.577184 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.577205 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.577215 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.679622 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.679670 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.679686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.679704 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.679717 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.783129 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.783205 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.783224 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.783253 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.783275 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.886769 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.886843 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.886855 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.886878 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.886892 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.991023 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.991090 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.991104 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.991130 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:39 crc kubenswrapper[4696]: I0318 15:37:39.991146 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:39Z","lastTransitionTime":"2026-03-18T15:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.094965 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.095063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.095088 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.095122 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.095149 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.199161 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.199800 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.199969 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.200195 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.200391 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.303731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.303768 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.303781 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.303801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.303813 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.406619 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.406659 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.406673 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.406690 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.406704 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.513025 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.513341 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.513430 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.513546 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.513646 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.597408 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:40 crc kubenswrapper[4696]: E0318 15:37:40.597594 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.597683 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:40 crc kubenswrapper[4696]: E0318 15:37:40.597759 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.597843 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:40 crc kubenswrapper[4696]: E0318 15:37:40.598055 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.616579 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.616624 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.616638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.616658 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.616670 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.719499 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.720179 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.720251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.720324 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.720387 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.823571 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.823644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.823667 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.823698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.823723 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.926783 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.926859 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.926882 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.926917 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:40 crc kubenswrapper[4696]: I0318 15:37:40.926939 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:40Z","lastTransitionTime":"2026-03-18T15:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.029523 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.029848 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.029951 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.030041 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.030129 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.133178 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.133263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.133285 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.133318 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.133340 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.236756 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.236803 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.236814 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.236830 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.236840 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.339914 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.340216 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.340298 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.340396 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.340470 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.443831 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.443911 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.443934 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.443969 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.443999 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.454447 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8l8zp"] Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.454951 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8l8zp" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.458003 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.458208 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.459886 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.474456 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.491419 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.503139 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.519778 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.536633 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.546668 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.546702 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.546712 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.546728 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.546740 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.554686 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.590720 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.613849 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.618984 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2bcs\" (UniqueName: \"kubernetes.io/projected/d0d76f08-84b8-44cb-b179-ebc9bc26a8b7-kube-api-access-b2bcs\") pod \"node-resolver-8l8zp\" (UID: \"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\") " pod="openshift-dns/node-resolver-8l8zp" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.619047 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d0d76f08-84b8-44cb-b179-ebc9bc26a8b7-hosts-file\") pod \"node-resolver-8l8zp\" (UID: \"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\") " pod="openshift-dns/node-resolver-8l8zp" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.627899 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.649554 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.649953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.650075 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.650145 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.650213 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.719637 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2bcs\" (UniqueName: \"kubernetes.io/projected/d0d76f08-84b8-44cb-b179-ebc9bc26a8b7-kube-api-access-b2bcs\") pod \"node-resolver-8l8zp\" (UID: \"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\") " pod="openshift-dns/node-resolver-8l8zp" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.720261 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d0d76f08-84b8-44cb-b179-ebc9bc26a8b7-hosts-file\") pod \"node-resolver-8l8zp\" (UID: \"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\") " pod="openshift-dns/node-resolver-8l8zp" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.720402 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d0d76f08-84b8-44cb-b179-ebc9bc26a8b7-hosts-file\") pod \"node-resolver-8l8zp\" (UID: \"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\") " pod="openshift-dns/node-resolver-8l8zp" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.741881 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2bcs\" (UniqueName: \"kubernetes.io/projected/d0d76f08-84b8-44cb-b179-ebc9bc26a8b7-kube-api-access-b2bcs\") pod \"node-resolver-8l8zp\" (UID: \"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\") " pod="openshift-dns/node-resolver-8l8zp" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.753253 4696 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.753342 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.753362 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.753384 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.753398 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.772634 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8l8zp" Mar 18 15:37:41 crc kubenswrapper[4696]: W0318 15:37:41.791929 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0d76f08_84b8_44cb_b179_ebc9bc26a8b7.slice/crio-e218128cb43cdc10f5702bde025c2c373f9078887b0f6ebc094480458faffa94 WatchSource:0}: Error finding container e218128cb43cdc10f5702bde025c2c373f9078887b0f6ebc094480458faffa94: Status 404 returned error can't find the container with id e218128cb43cdc10f5702bde025c2c373f9078887b0f6ebc094480458faffa94 Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.848708 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-c7nz9"] Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.849212 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jjkqr"] Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.849441 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.849780 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.851520 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.854721 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.854879 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.854886 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.855025 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.855120 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.855367 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.855552 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.855560 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.858618 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.859124 
4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-w9dbn"] Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.860143 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w9dbn" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.862909 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.862966 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.864441 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.864483 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.864497 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.864522 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.864539 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.880171 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.902200 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.919726 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.935416 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.950292 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.966512 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.967892 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.967932 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.967941 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.967958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.967970 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:41Z","lastTransitionTime":"2026-03-18T15:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:41 crc kubenswrapper[4696]: I0318 15:37:41.984774 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.007969 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.018737 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8l8zp" event={"ID":"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7","Type":"ContainerStarted","Data":"e218128cb43cdc10f5702bde025c2c373f9078887b0f6ebc094480458faffa94"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.023451 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-hostroot\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.023601 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-conf-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.023692 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74b6f45-9bfc-4439-b43b-03f441c544fd-proxy-tls\") pod \"machine-config-daemon-jjkqr\" (UID: 
\"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.023790 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d74b6f45-9bfc-4439-b43b-03f441c544fd-mcd-auth-proxy-config\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.023915 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-system-cni-dir\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.024183 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d74b6f45-9bfc-4439-b43b-03f441c544fd-rootfs\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.024419 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-cnibin\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.024593 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-os-release\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.024826 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-cni-bin\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.024916 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e865105-7459-4de0-ade2-9bac1ff5f094-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.024993 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4e865105-7459-4de0-ade2-9bac1ff5f094-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025085 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-cni-multus\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025185 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49424478-cad5-4788-b01e-4ebde47480e1-multus-daemon-config\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025263 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-multus-certs\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025348 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49424478-cad5-4788-b01e-4ebde47480e1-cni-binary-copy\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025451 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-socket-dir-parent\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025641 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-k8s-cni-cncf-io\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025733 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-netns\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025817 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-kubelet\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025898 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-etc-kubernetes\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.025974 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-cni-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.026065 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcnmv\" (UniqueName: \"kubernetes.io/projected/4e865105-7459-4de0-ade2-9bac1ff5f094-kube-api-access-xcnmv\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.026137 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kjv48\" (UniqueName: \"kubernetes.io/projected/d74b6f45-9bfc-4439-b43b-03f441c544fd-kube-api-access-kjv48\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.026224 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.026293 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-cnibin\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.026357 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-os-release\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.026428 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-system-cni-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.026493 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66fz\" (UniqueName: \"kubernetes.io/projected/49424478-cad5-4788-b01e-4ebde47480e1-kube-api-access-k66fz\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.026489 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.036871 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.050198 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.065321 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.069891 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.069920 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.069932 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.069951 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.069964 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.076043 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.088613 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.101526 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.119432 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d09013451
1ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.127648 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-cni-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.127762 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-cni-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.127881 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcnmv\" (UniqueName: \"kubernetes.io/projected/4e865105-7459-4de0-ade2-9bac1ff5f094-kube-api-access-xcnmv\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.127991 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjv48\" (UniqueName: \"kubernetes.io/projected/d74b6f45-9bfc-4439-b43b-03f441c544fd-kube-api-access-kjv48\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128107 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128256 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-cnibin\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 
15:37:42.128335 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-os-release\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128397 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-os-release\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128303 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-cnibin\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128412 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-system-cni-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128618 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66fz\" (UniqueName: \"kubernetes.io/projected/49424478-cad5-4788-b01e-4ebde47480e1-kube-api-access-k66fz\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128711 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-hostroot\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128681 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128811 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-conf-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128981 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-conf-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128813 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-hostroot\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129011 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74b6f45-9bfc-4439-b43b-03f441c544fd-proxy-tls\") pod 
\"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129087 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d74b6f45-9bfc-4439-b43b-03f441c544fd-mcd-auth-proxy-config\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.128801 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-system-cni-dir\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129163 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-system-cni-dir\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129137 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e865105-7459-4de0-ade2-9bac1ff5f094-system-cni-dir\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129238 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d74b6f45-9bfc-4439-b43b-03f441c544fd-rootfs\") pod 
\"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129270 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-cnibin\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129294 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-os-release\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129331 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-cni-bin\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129351 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-os-release\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129364 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e865105-7459-4de0-ade2-9bac1ff5f094-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 
15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129375 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-cnibin\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129395 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4e865105-7459-4de0-ade2-9bac1ff5f094-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129403 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-cni-bin\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129471 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-cni-multus\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129501 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49424478-cad5-4788-b01e-4ebde47480e1-multus-daemon-config\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129521 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-multus-certs\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129553 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49424478-cad5-4788-b01e-4ebde47480e1-cni-binary-copy\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129570 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-socket-dir-parent\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129589 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-k8s-cni-cncf-io\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129300 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d74b6f45-9bfc-4439-b43b-03f441c544fd-rootfs\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129626 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-netns\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129605 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-netns\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129655 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-cni-multus\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129688 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-kubelet\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129733 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-etc-kubernetes\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129793 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-etc-kubernetes\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " 
pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129836 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-multus-certs\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129836 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-var-lib-kubelet\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129882 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-host-run-k8s-cni-cncf-io\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.129885 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49424478-cad5-4788-b01e-4ebde47480e1-multus-socket-dir-parent\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.130302 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d74b6f45-9bfc-4439-b43b-03f441c544fd-mcd-auth-proxy-config\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 
15:37:42.130441 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49424478-cad5-4788-b01e-4ebde47480e1-multus-daemon-config\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.130503 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4e865105-7459-4de0-ade2-9bac1ff5f094-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.130618 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49424478-cad5-4788-b01e-4ebde47480e1-cni-binary-copy\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.130977 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e865105-7459-4de0-ade2-9bac1ff5f094-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.134187 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74b6f45-9bfc-4439-b43b-03f441c544fd-proxy-tls\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.139655 4696 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b637332
0a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.145029 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcnmv\" (UniqueName: \"kubernetes.io/projected/4e865105-7459-4de0-ade2-9bac1ff5f094-kube-api-access-xcnmv\") pod \"multus-additional-cni-plugins-c7nz9\" (UID: \"4e865105-7459-4de0-ade2-9bac1ff5f094\") " pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.146859 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66fz\" (UniqueName: 
\"kubernetes.io/projected/49424478-cad5-4788-b01e-4ebde47480e1-kube-api-access-k66fz\") pod \"multus-w9dbn\" (UID: \"49424478-cad5-4788-b01e-4ebde47480e1\") " pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.149490 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjv48\" (UniqueName: \"kubernetes.io/projected/d74b6f45-9bfc-4439-b43b-03f441c544fd-kube-api-access-kjv48\") pod \"machine-config-daemon-jjkqr\" (UID: \"d74b6f45-9bfc-4439-b43b-03f441c544fd\") " pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.152442 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.163224 4696 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.172984 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.173035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.173048 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.173069 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.173085 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.177658 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.183279 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.194107 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: W0318 15:37:42.194327 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74b6f45_9bfc_4439_b43b_03f441c544fd.slice/crio-d70875b6fc7b635ab6de0191ddcec447fdf3a9bd32824b947d1d90f71921450c WatchSource:0}: Error finding container d70875b6fc7b635ab6de0191ddcec447fdf3a9bd32824b947d1d90f71921450c: Status 404 returned error can't find the container with id d70875b6fc7b635ab6de0191ddcec447fdf3a9bd32824b947d1d90f71921450c Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.197504 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.206469 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-w9dbn" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.211683 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lqxgs"] Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.212872 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.215692 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.216068 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.216117 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.216368 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.216422 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.218343 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.218402 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.220044 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.237850 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.254065 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: W0318 15:37:42.257282 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49424478_cad5_4788_b01e_4ebde47480e1.slice/crio-8b847e287da85d4be15f094111dcf055de58394a22df69ff449065a32f16c4a9 WatchSource:0}: Error finding container 8b847e287da85d4be15f094111dcf055de58394a22df69ff449065a32f16c4a9: Status 404 returned error can't find the container with id 8b847e287da85d4be15f094111dcf055de58394a22df69ff449065a32f16c4a9 Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.268963 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.275742 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.275955 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.276104 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.276278 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.276419 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.283476 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.300019 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.314193 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.328620 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.331959 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-config\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332002 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-env-overrides\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332032 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-netns\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 
15:37:42.332085 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-var-lib-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332111 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332154 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-systemd-units\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332180 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-etc-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332197 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-kubelet\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 
15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332214 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332258 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-netd\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332393 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332441 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-ovn\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332465 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvzl\" (UniqueName: \"kubernetes.io/projected/1dc15d44-2b63-40b8-b9c8-dad533d01710-kube-api-access-9vvzl\") pod \"ovnkube-node-lqxgs\" (UID: 
\"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332487 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-slash\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332512 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-systemd\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332609 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-node-log\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-bin\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332665 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-log-socket\") pod \"ovnkube-node-lqxgs\" (UID: 
\"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332683 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-script-lib\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.332701 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovn-node-metrics-cert\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.349426 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.373186 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.379001 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.379046 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.379056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.379073 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.379084 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.387474 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.404387 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.419442 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433684 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433729 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-ovn\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433756 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvzl\" (UniqueName: \"kubernetes.io/projected/1dc15d44-2b63-40b8-b9c8-dad533d01710-kube-api-access-9vvzl\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433773 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-systemd\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433793 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-slash\") pod 
\"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433815 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-node-log\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433829 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-bin\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433835 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-ovn\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433870 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-slash\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433835 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 
crc kubenswrapper[4696]: I0318 15:37:42.433852 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-log-socket\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433901 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-log-socket\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433915 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-node-log\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433930 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-script-lib\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433949 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-bin\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433954 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovn-node-metrics-cert\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433978 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-systemd\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.433984 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-env-overrides\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434009 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-netns\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434023 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-config\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434066 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-var-lib-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434084 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434110 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-systemd-units\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434127 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-etc-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434150 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-kubelet\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434169 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434210 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-netd\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434223 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-var-lib-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434298 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-netns\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434324 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434346 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-systemd-units\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434372 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-etc-openvswitch\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434393 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-kubelet\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434418 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434455 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-netd\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434653 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-script-lib\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.434989 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-config\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.435033 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-env-overrides\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.437334 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.451898 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovn-node-metrics-cert\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.452709 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.454260 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvzl\" (UniqueName: \"kubernetes.io/projected/1dc15d44-2b63-40b8-b9c8-dad533d01710-kube-api-access-9vvzl\") pod \"ovnkube-node-lqxgs\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.481687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.481752 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.481765 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.481789 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.481802 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.532507 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:42 crc kubenswrapper[4696]: W0318 15:37:42.546541 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc15d44_2b63_40b8_b9c8_dad533d01710.slice/crio-b7d855baf66edc8258ad5042e989a341246a8d46a831759090946f8f742fdfd9 WatchSource:0}: Error finding container b7d855baf66edc8258ad5042e989a341246a8d46a831759090946f8f742fdfd9: Status 404 returned error can't find the container with id b7d855baf66edc8258ad5042e989a341246a8d46a831759090946f8f742fdfd9 Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.585955 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.586017 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.586030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.586053 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.586067 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.597076 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:42 crc kubenswrapper[4696]: E0318 15:37:42.597210 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.597208 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:42 crc kubenswrapper[4696]: E0318 15:37:42.597477 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.597612 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:42 crc kubenswrapper[4696]: E0318 15:37:42.597679 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.689902 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.689950 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.689963 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.689983 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.689997 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.792018 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.792069 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.792084 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.792099 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.792109 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.895116 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.895192 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.895215 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.895245 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.895265 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.998487 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.998588 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.998621 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.998648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:42 crc kubenswrapper[4696]: I0318 15:37:42.998665 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:42Z","lastTransitionTime":"2026-03-18T15:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.025530 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8l8zp" event={"ID":"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7","Type":"ContainerStarted","Data":"1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.027462 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e865105-7459-4de0-ade2-9bac1ff5f094" containerID="eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256" exitCode=0 Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.027491 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerDied","Data":"eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.027539 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerStarted","Data":"c8e27058d13129b7a9dc64143ee1bf825f24d888506c2c390b15d9c5db951ab2"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.029308 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854" exitCode=0 Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.029414 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.029508 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" 
event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"b7d855baf66edc8258ad5042e989a341246a8d46a831759090946f8f742fdfd9"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.032655 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w9dbn" event={"ID":"49424478-cad5-4788-b01e-4ebde47480e1","Type":"ContainerStarted","Data":"684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.032684 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w9dbn" event={"ID":"49424478-cad5-4788-b01e-4ebde47480e1","Type":"ContainerStarted","Data":"8b847e287da85d4be15f094111dcf055de58394a22df69ff449065a32f16c4a9"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.038242 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.038324 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.038340 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"d70875b6fc7b635ab6de0191ddcec447fdf3a9bd32824b947d1d90f71921450c"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.042632 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.067629 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.083476 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.096274 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.101551 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.101639 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.101656 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.101683 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.101700 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.108678 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.134581 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.153091 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.171182 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.188433 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.205035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.205101 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.205118 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.205138 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.205150 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.220109 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.235349 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.313457 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.317049 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.317085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.317101 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.317125 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.317138 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.325982 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.349249 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.371673 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.393828 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.410460 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.427373 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.427766 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.427946 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.428042 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.428163 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.428256 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.440635 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.456231 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c34
77e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.470704 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.483739 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.495497 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.507283 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.519024 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.531502 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.532054 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.532087 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.532099 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.532120 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.532136 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.634759 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.634948 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.635021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.635132 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.635206 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.738199 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.738359 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.738418 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.738484 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.738551 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.844163 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.844681 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.844694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.844714 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.844727 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.946935 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.946975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.946983 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.946999 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:43 crc kubenswrapper[4696]: I0318 15:37:43.947009 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:43Z","lastTransitionTime":"2026-03-18T15:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.046357 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.046409 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.046419 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.046429 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.049335 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerStarted","Data":"50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.049492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.049524 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 
15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.049536 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.049584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.049598 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.065905 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.081170 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.096551 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.112806 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.129659 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.149050 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.151809 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.151858 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.151874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.151895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.151908 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.171615 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.186642 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.200686 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.216767 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.230957 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.242161 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.253265 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.255427 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.255519 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.255614 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.255633 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.255644 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.358809 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.358870 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.358910 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.358933 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.358948 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.461702 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.461770 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.461782 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.461803 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.461821 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.565363 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.565422 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.565440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.565466 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.565481 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.596901 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.596983 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.597080 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:44 crc kubenswrapper[4696]: E0318 15:37:44.597308 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:44 crc kubenswrapper[4696]: E0318 15:37:44.597492 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:44 crc kubenswrapper[4696]: E0318 15:37:44.597670 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.668731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.668806 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.668825 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.668855 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.668875 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.772024 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.772100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.772122 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.772157 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.772180 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.875487 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.875589 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.875607 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.875629 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.875643 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.978760 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.978818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.978831 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.978858 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:44 crc kubenswrapper[4696]: I0318 15:37:44.978893 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:44Z","lastTransitionTime":"2026-03-18T15:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.055542 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.055604 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.057625 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e865105-7459-4de0-ade2-9bac1ff5f094" containerID="50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef" exitCode=0 Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.057660 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerDied","Data":"50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.076428 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.090174 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.090213 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.090224 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.090238 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.090246 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.093496 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.112401 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.148658 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.170835 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.193231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.193284 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.193297 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.193316 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.193330 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.193997 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.212089 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.232917 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.251080 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.262365 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.273609 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.292271 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.297179 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.297216 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.297228 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.297244 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.297254 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.311369 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.399394 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.399439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.399451 4696 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.399470 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.399481 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.501745 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.501779 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.501787 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.501801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.501810 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.597653 4696 scope.go:117] "RemoveContainer" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.603408 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.603437 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.603445 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.603461 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.603472 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.705796 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.705840 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.705851 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.705868 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.705881 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.808117 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.808159 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.808168 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.808182 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.808191 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.911885 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.911964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.911982 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.912014 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:45 crc kubenswrapper[4696]: I0318 15:37:45.912057 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:45Z","lastTransitionTime":"2026-03-18T15:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.015114 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.015177 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.015192 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.015215 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.015232 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.064275 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e865105-7459-4de0-ade2-9bac1ff5f094" containerID="da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b" exitCode=0 Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.064342 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerDied","Data":"da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.067192 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.069696 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.069941 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.097860 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.115850 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.118934 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.118998 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.119010 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.119036 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.119066 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.130434 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.146731 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.166892 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.182362 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.193917 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.207097 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.220794 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.221146 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.221190 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.221200 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.221216 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.221225 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.242023 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 
15:37:46.257866 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.273240 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.284877 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.299185 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.310832 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.323227 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.323961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.323995 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.324004 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.324038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.324049 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.342784 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.357615 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.370274 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.380931 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.395936 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.416045 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.427097 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.427156 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.427169 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.427189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.427204 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.431759 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.445356 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.464176 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.481723 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.529719 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.529754 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.529762 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.529777 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.529786 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.596744 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.596751 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:46 crc kubenswrapper[4696]: E0318 15:37:46.596909 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.596751 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:46 crc kubenswrapper[4696]: E0318 15:37:46.597048 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:46 crc kubenswrapper[4696]: E0318 15:37:46.597093 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.632126 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.632168 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.632177 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.632211 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.632222 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.734320 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.734357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.734369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.734381 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.734390 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.837502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.837824 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.837838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.837853 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.837866 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.940691 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.940785 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.940801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.940826 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:46 crc kubenswrapper[4696]: I0318 15:37:46.940846 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:46Z","lastTransitionTime":"2026-03-18T15:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.044204 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.044257 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.044270 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.044287 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.044300 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.076301 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.079233 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e865105-7459-4de0-ade2-9bac1ff5f094" containerID="1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86" exitCode=0 Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.079328 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerDied","Data":"1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.095310 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.108688 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.126896 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.141216 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.146486 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.146605 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.146627 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.146659 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.146682 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.154436 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.167972 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.180454 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.199101 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.225199 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.240799 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.249383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.249416 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.249429 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.249446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.249457 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.254096 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.264501 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.275405 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.352747 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.352791 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.352801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.352818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.352828 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.455557 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.455608 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.455620 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.455637 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.455646 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.558405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.558450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.558461 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.558478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.558492 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.619668 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.637877 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.661219 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.664131 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.664182 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.664192 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.664213 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.664224 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.681122 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.701205 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.718249 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.737479 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.750763 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.766156 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.766247 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.766266 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.766289 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.766303 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.771717 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.787449 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.803459 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.817496 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.838325 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.868608 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.868676 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.868692 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.868718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.868738 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.971713 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.971782 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.971797 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.971820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:47 crc kubenswrapper[4696]: I0318 15:37:47.971833 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:47Z","lastTransitionTime":"2026-03-18T15:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.046314 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.046389 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.046405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.046424 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.046435 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.062225 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.066403 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.066446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.066457 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.066478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.066493 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.083667 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.087468 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.087519 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.087542 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.087563 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.087586 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.088086 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e865105-7459-4de0-ade2-9bac1ff5f094" containerID="58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213" exitCode=0 Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.088152 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerDied","Data":"58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.105542 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.111673 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-9
4dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.116398 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.116431 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.116440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.116457 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.116468 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.127266 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.128339 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.133090 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.133138 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.133149 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.133167 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.133179 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.143504 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.147626 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.147808 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.149479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.149517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.149532 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.149567 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.149580 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.168356 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.186576 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.204818 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.220888 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.237044 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.238575 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kxs7v"] Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.239166 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.242862 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.243024 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.243118 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.243417 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.252390 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.252439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.252449 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.252465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.252478 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.261760 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.281934 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.297289 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.311761 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.326745 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.336365 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.349797 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.355262 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.355325 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.355339 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.355381 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.355398 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.363355 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.366027 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-host\") pod \"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.366149 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-serviceca\") pod \"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.366194 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gccsp\" (UniqueName: \"kubernetes.io/projected/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-kube-api-access-gccsp\") pod 
\"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.378013 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.395086 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.407175 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.426310 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.440454 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.457004 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.458478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.458557 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.458574 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.458600 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.458617 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.467598 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gccsp\" (UniqueName: \"kubernetes.io/projected/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-kube-api-access-gccsp\") pod \"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.467683 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-host\") pod \"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.467728 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-serviceca\") pod \"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.467769 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-host\") pod \"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.469026 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-serviceca\") pod \"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.472134 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name
\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.485846 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gccsp\" (UniqueName: \"kubernetes.io/projected/1dd9d1ab-5bb8-4005-bf13-be7a7620e92f-kube-api-access-gccsp\") pod \"node-ca-kxs7v\" (UID: \"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\") " pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.492889 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswi
tch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.523644 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\
"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f63
3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.541964 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.556933 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.560208 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kxs7v" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.562576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.562610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.562623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.562644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.562659 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.597043 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.597107 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.597073 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.597245 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.597296 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.597383 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.670010 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.670060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.670073 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.670094 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.670107 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.772684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.772700 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.772717 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.772859 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:20.772834941 +0000 UTC m=+143.779009147 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.772895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.772908 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.772940 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.772918 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.773033 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.773098 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:20.773077906 +0000 UTC m=+143.779252112 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.773018 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.773127 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.773261 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:20.77324981 +0000 UTC m=+143.779424016 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.874004 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.874049 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.874164 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.874179 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.874189 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.874241 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:20.874227844 +0000 UTC m=+143.880402050 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.874300 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.874361 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.874379 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:48 crc kubenswrapper[4696]: E0318 15:37:48.874686 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-18 15:38:20.874655214 +0000 UTC m=+143.880829430 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.876580 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.876618 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.876628 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.876648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.876659 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.983009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.983056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.983066 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.983086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:48 crc kubenswrapper[4696]: I0318 15:37:48.983098 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:48Z","lastTransitionTime":"2026-03-18T15:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.085945 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.085990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.086001 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.086023 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.086053 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.097439 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.098481 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.102470 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e865105-7459-4de0-ade2-9bac1ff5f094" containerID="642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139" exitCode=0 Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.102546 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerDied","Data":"642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.105067 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kxs7v" event={"ID":"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f","Type":"ContainerStarted","Data":"c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.105099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kxs7v" event={"ID":"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f","Type":"ContainerStarted","Data":"35980b08e99c69f0241edddd499d5a85ce8063fcada89241f8ec9fe2506a5cc5"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.123081 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.131445 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.139003 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.151777 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.166819 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.183160 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.189672 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.189724 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.189737 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.189808 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.189822 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.200800 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.216503 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.237848 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.263340 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.281325 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.295068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.295115 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.295125 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.295139 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.295149 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.298279 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.315371 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c34
77e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.330622 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.349111 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.381793 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.399598 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.400162 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.400191 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.400221 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.400241 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.404595 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.425530 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.446374 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.479701 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.497000 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.502248 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.502313 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.502332 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.502360 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.502377 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.509708 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.526937 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c34
77e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.542857 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.560725 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e854
19967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.580430 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.595286 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.604614 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.604677 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.604698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.604767 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.604787 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.614085 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.630566 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.708385 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.708453 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.708467 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.708486 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.708497 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.810905 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.810955 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.810967 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.810985 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.811005 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.913480 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.913557 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.913566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.913583 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:49 crc kubenswrapper[4696]: I0318 15:37:49.913594 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:49Z","lastTransitionTime":"2026-03-18T15:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.017005 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.017083 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.017102 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.017133 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.017156 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.113253 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" event={"ID":"4e865105-7459-4de0-ade2-9bac1ff5f094","Type":"ContainerStarted","Data":"f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.114063 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.114121 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.119863 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.119905 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.119916 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.119935 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.119948 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.136262 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.152753 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.153847 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.175311 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.191701 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.209107 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.222139 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.222615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.222691 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.222709 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.222736 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.222752 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.245345 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.264610 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.281615 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.294608 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.314832 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.326184 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.326240 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.326260 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.326287 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.326306 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.331064 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.351029 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.366265 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.381840 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.397087 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.413757 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.429753 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.430926 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.430962 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.430991 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.431009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.431020 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.445399 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.461010 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.473634 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.486018 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.497699 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.522235 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15
:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.534836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.534882 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.534891 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.534907 4696 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.534918 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.536833 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.551983 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.566425 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.585047 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.597348 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:50 crc kubenswrapper[4696]: E0318 15:37:50.597539 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.597789 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:50 crc kubenswrapper[4696]: E0318 15:37:50.597864 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.598236 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:50 crc kubenswrapper[4696]: E0318 15:37:50.598304 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.638070 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.638140 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.638155 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.638179 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.638201 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.740698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.740746 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.740773 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.740788 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.740799 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.843251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.843297 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.843309 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.843325 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.843337 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.946101 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.946195 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.946210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.946232 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:50 crc kubenswrapper[4696]: I0318 15:37:50.946244 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:50Z","lastTransitionTime":"2026-03-18T15:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.048586 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.048663 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.048681 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.048709 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.048728 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.151189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.151245 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.151258 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.151273 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.151283 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.253444 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.253481 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.253489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.253501 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.253510 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.356471 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.356508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.356516 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.356546 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.356556 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.458503 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.458566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.458578 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.458596 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.458609 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.560475 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.560509 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.560533 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.560547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.560557 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.663209 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.663263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.663275 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.663293 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.663307 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.765369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.765423 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.765446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.765465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.765478 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.867715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.867758 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.867774 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.867791 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.867801 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.970404 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.970461 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.970473 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.970488 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:51 crc kubenswrapper[4696]: I0318 15:37:51.970496 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:51Z","lastTransitionTime":"2026-03-18T15:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.072927 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.072985 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.072997 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.073013 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.073033 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.120784 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/0.log" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.123311 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b" exitCode=1 Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.123350 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.123945 4696 scope.go:117] "RemoveContainer" containerID="f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.146326 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.164436 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.176475 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.176521 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.176547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.176567 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.176599 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.182468 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.202287 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.220990 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:51Z\\\",\\\"message\\\":\\\"nformers/externalversions/factory.go:140\\\\nI0318 15:37:51.446452 6560 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:37:51.447062 6560 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 15:37:51.447080 6560 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 15:37:51.447119 6560 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:37:51.447128 6560 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 15:37:51.447136 6560 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 15:37:51.447137 6560 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 15:37:51.447148 6560 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:37:51.447158 6560 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:37:51.447177 6560 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 15:37:51.447198 6560 factory.go:656] Stopping watch factory\\\\nI0318 15:37:51.447197 6560 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:37:51.447216 6560 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:37:51.447218 6560 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.234014 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.250378 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.264281 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.281031 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.281134 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.281154 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.281887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.281935 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.282401 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.299335 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.312948 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.327763 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.343889 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.354236 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.385409 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.385448 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.385461 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.385481 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.385494 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.488857 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.488897 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.488908 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.488923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.488935 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.591438 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.591479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.591489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.591504 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.591514 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.596814 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.596818 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.596976 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:52 crc kubenswrapper[4696]: E0318 15:37:52.597062 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:52 crc kubenswrapper[4696]: E0318 15:37:52.597145 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:52 crc kubenswrapper[4696]: E0318 15:37:52.597204 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.693800 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.693851 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.693863 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.693880 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.693891 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.796402 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.796450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.796459 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.796475 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.796489 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.898895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.898936 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.898946 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.898964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:52 crc kubenswrapper[4696]: I0318 15:37:52.898976 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:52Z","lastTransitionTime":"2026-03-18T15:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.001610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.001659 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.001671 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.001693 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.001706 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.104612 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.104654 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.104664 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.104680 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.104691 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.129550 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/1.log" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.130303 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/0.log" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.135147 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75" exitCode=1 Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.135203 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.135266 4696 scope.go:117] "RemoveContainer" containerID="f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.136428 4696 scope.go:117] "RemoveContainer" containerID="d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75" Mar 18 15:37:53 crc kubenswrapper[4696]: E0318 15:37:53.136766 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.150687 4696 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.174699 4696 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84d84884403c68a6ba2fb5df5b4db2e31aac4ee904ce7cc7d8c544da609146b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:51Z\\\",\\\"message\\\":\\\"nformers/externalversions/factory.go:140\\\\nI0318 15:37:51.446452 6560 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:37:51.447062 6560 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0318 15:37:51.447080 6560 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 15:37:51.447119 6560 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:37:51.447128 6560 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 15:37:51.447136 6560 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 15:37:51.447137 6560 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 15:37:51.447148 6560 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:37:51.447158 6560 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:37:51.447177 6560 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 15:37:51.447198 6560 factory.go:656] Stopping watch factory\\\\nI0318 15:37:51.447197 6560 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:37:51.447216 6560 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:37:51.447218 6560 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] 
Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42
a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.195410 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.207360 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.207455 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.207476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.207510 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.207568 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.212672 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.226582 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.239284 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.255022 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.268707 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.285115 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.302107 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.310569 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.310610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.310621 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.310639 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.310665 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.316030 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.330558 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.345318 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.356945 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.412563 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.413285 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.413345 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.413377 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.413400 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.516141 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.516180 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.516189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.516202 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.516211 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.618853 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.618888 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.618896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.618908 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.618916 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.726110 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.726153 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.726173 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.726189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.726200 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.828534 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.828598 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.828610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.828629 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.828641 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.932233 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.932308 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.932331 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.932364 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:53 crc kubenswrapper[4696]: I0318 15:37:53.932389 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:53Z","lastTransitionTime":"2026-03-18T15:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.035271 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.035316 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.035327 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.035346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.035359 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.140994 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.141085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.141104 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.141131 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.141150 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.142131 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/1.log" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.149230 4696 scope.go:117] "RemoveContainer" containerID="d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75" Mar 18 15:37:54 crc kubenswrapper[4696]: E0318 15:37:54.149523 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.175050 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.193701 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.215355 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.226321 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9"] Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.226781 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.229281 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.230344 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.236374 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 
10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb
62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.243363 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.243405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.243423 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.243446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.243458 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.259661 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.272831 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.284392 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.297454 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.310820 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.326693 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.333927 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e691f542-103a-4e95-966d-dbb5ab5ddaee-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.333970 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e691f542-103a-4e95-966d-dbb5ab5ddaee-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.334040 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4lmx\" (UniqueName: \"kubernetes.io/projected/e691f542-103a-4e95-966d-dbb5ab5ddaee-kube-api-access-c4lmx\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.334058 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e691f542-103a-4e95-966d-dbb5ab5ddaee-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: 
\"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.340102 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.345766 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.345814 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.345824 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.345844 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.345858 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.353290 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.362666 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.375162 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.388787 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.399920 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.411207 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.422611 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.435763 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e691f542-103a-4e95-966d-dbb5ab5ddaee-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.435864 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e691f542-103a-4e95-966d-dbb5ab5ddaee-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.435897 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4lmx\" (UniqueName: \"kubernetes.io/projected/e691f542-103a-4e95-966d-dbb5ab5ddaee-kube-api-access-c4lmx\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.435948 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e691f542-103a-4e95-966d-dbb5ab5ddaee-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.436452 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e691f542-103a-4e95-966d-dbb5ab5ddaee-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.436896 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e691f542-103a-4e95-966d-dbb5ab5ddaee-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.445698 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e691f542-103a-4e95-966d-dbb5ab5ddaee-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.446390 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.448690 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.448716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.448727 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.448784 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.448800 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.458243 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4lmx\" (UniqueName: \"kubernetes.io/projected/e691f542-103a-4e95-966d-dbb5ab5ddaee-kube-api-access-c4lmx\") pod \"ovnkube-control-plane-749d76644c-cvzw9\" (UID: \"e691f542-103a-4e95-966d-dbb5ab5ddaee\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.463030 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.477458 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.492084 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.515194 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb
62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.531709 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.539988 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.552655 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.552693 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.552704 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.552722 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.552733 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.552978 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.570920 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.583313 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.596491 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:54 crc kubenswrapper[4696]: E0318 15:37:54.596660 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.596865 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:54 crc kubenswrapper[4696]: E0318 15:37:54.596916 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.597032 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:54 crc kubenswrapper[4696]: E0318 15:37:54.597078 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.602107 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.619317 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.654985 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.655015 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.655024 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.655037 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.655045 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.758543 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.758591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.758603 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.758618 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.758630 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.860387 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.860420 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.860430 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.860445 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.860457 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.963092 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.963134 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.963144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.963161 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.963172 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:54Z","lastTransitionTime":"2026-03-18T15:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.967117 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-k88c8"] Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.967836 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:54 crc kubenswrapper[4696]: E0318 15:37:54.967917 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:37:54 crc kubenswrapper[4696]: I0318 15:37:54.982346 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.003655 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.023414 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc 
kubenswrapper[4696]: I0318 15:37:55.042325 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.054690 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.066295 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.066340 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.066350 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.066368 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.066379 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.066859 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.078276 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.092482 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.112857 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851ef
e67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.132806 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.143185 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89zn9\" (UniqueName: \"kubernetes.io/projected/701f97fc-e026-4b52-ac03-e4bccbf34972-kube-api-access-89zn9\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.143265 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.149886 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.154981 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" event={"ID":"e691f542-103a-4e95-966d-dbb5ab5ddaee","Type":"ContainerStarted","Data":"07232a46e61b298ee91534498283f5e0cf00c9fc6d8789b04c96d696807e3600"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.155077 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" event={"ID":"e691f542-103a-4e95-966d-dbb5ab5ddaee","Type":"ContainerStarted","Data":"89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.155110 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" 
event={"ID":"e691f542-103a-4e95-966d-dbb5ab5ddaee","Type":"ContainerStarted","Data":"35b4094cc1fb060b3699cfd47fa87423be5ffc225cb7de9d61d5073497b688e2"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.165070 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountP
ath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.169022 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.169064 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.169079 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.169104 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.169122 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.185719 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb
62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.206026 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.217723 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.232370 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.244762 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89zn9\" (UniqueName: \"kubernetes.io/projected/701f97fc-e026-4b52-ac03-e4bccbf34972-kube-api-access-89zn9\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.244799 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:55 crc kubenswrapper[4696]: E0318 15:37:55.245001 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:37:55 crc kubenswrapper[4696]: E0318 15:37:55.245070 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs podName:701f97fc-e026-4b52-ac03-e4bccbf34972 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:55.745056647 +0000 UTC m=+118.751230853 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs") pod "network-metrics-daemon-k88c8" (UID: "701f97fc-e026-4b52-ac03-e4bccbf34972") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.249726 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.272065 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.272196 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89zn9\" (UniqueName: \"kubernetes.io/projected/701f97fc-e026-4b52-ac03-e4bccbf34972-kube-api-access-89zn9\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.272259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.272330 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.272351 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.272396 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.272416 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.291792 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc 
kubenswrapper[4696]: I0318 15:37:55.310546 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.328660 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.344659 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.362972 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.376046 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.376100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.376110 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.376128 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.376143 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.380222 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.396451 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.415109 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.448819 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb
62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.466097 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07232a46e61b298ee91534498283f5e0cf00c
9fc6d8789b04c96d696807e3600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.480099 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.480144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.480156 4696 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.480177 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.480190 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.493990 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a
95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.512151 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.527782 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.544044 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.582769 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.582819 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.582831 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.582850 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.582862 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.686446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.686502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.686513 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.686552 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.686566 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.751360 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:55 crc kubenswrapper[4696]: E0318 15:37:55.751621 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:37:55 crc kubenswrapper[4696]: E0318 15:37:55.751728 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs podName:701f97fc-e026-4b52-ac03-e4bccbf34972 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:56.751708649 +0000 UTC m=+119.757882855 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs") pod "network-metrics-daemon-k88c8" (UID: "701f97fc-e026-4b52-ac03-e4bccbf34972") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.790113 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.790164 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.790177 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.790195 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.790209 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.894252 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.894341 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.894363 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.894391 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.894409 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.997872 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.997930 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.997943 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.997964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:55 crc kubenswrapper[4696]: I0318 15:37:55.997980 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:55Z","lastTransitionTime":"2026-03-18T15:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.101449 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.101555 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.101589 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.101623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.101646 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.204668 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.204727 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.204745 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.204775 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.204797 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.309055 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.309120 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.309131 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.309150 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.309161 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.412620 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.413035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.413053 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.413086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.413105 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.516762 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.516826 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.516842 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.516865 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.516881 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.597163 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.597268 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.597181 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.597368 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:56 crc kubenswrapper[4696]: E0318 15:37:56.597404 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:56 crc kubenswrapper[4696]: E0318 15:37:56.597480 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:56 crc kubenswrapper[4696]: E0318 15:37:56.597645 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:37:56 crc kubenswrapper[4696]: E0318 15:37:56.597772 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.619465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.619515 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.619547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.619575 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.619589 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.723992 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.724036 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.724047 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.724061 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.724070 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.765676 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:56 crc kubenswrapper[4696]: E0318 15:37:56.765972 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:37:56 crc kubenswrapper[4696]: E0318 15:37:56.766126 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs podName:701f97fc-e026-4b52-ac03-e4bccbf34972 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:58.766098199 +0000 UTC m=+121.772272405 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs") pod "network-metrics-daemon-k88c8" (UID: "701f97fc-e026-4b52-ac03-e4bccbf34972") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.826456 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.826538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.826552 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.826570 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.826591 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.930047 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.930089 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.930100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.930118 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:56 crc kubenswrapper[4696]: I0318 15:37:56.930131 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:56Z","lastTransitionTime":"2026-03-18T15:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.033257 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.033317 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.033326 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.033355 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.033367 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:57Z","lastTransitionTime":"2026-03-18T15:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.136124 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.136162 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.136170 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.136183 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.136191 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:57Z","lastTransitionTime":"2026-03-18T15:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.238446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.238478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.238487 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.238500 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.238509 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:57Z","lastTransitionTime":"2026-03-18T15:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.340921 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.340983 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.340993 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.341007 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.341017 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:57Z","lastTransitionTime":"2026-03-18T15:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.444160 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.444208 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.444219 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.444235 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.444247 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:57Z","lastTransitionTime":"2026-03-18T15:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.546495 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.546550 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.546561 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.546578 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.546589 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:57Z","lastTransitionTime":"2026-03-18T15:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.615214 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.628730 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.637949 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc 
kubenswrapper[4696]: E0318 15:37:57.647730 4696 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.649394 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.660379 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\
",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.673906 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.684410 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: E0318 15:37:57.694186 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.711223 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.724376 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.739229 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.754484 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 
15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.776678 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb
62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.790650 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07232a46e61b298ee91534498283f5e0cf00c
9fc6d8789b04c96d696807e3600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.804906 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.815464 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:57 crc kubenswrapper[4696]: I0318 15:37:57.825225 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.421021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.421084 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.421098 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.421121 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.421135 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:58Z","lastTransitionTime":"2026-03-18T15:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.436423 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.441992 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.442039 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.442058 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.442076 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.442087 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:58Z","lastTransitionTime":"2026-03-18T15:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.460891 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.471359 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.471474 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.471497 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.471544 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.471565 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:58Z","lastTransitionTime":"2026-03-18T15:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.495413 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.501291 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.501357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.501376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.501401 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.501420 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:58Z","lastTransitionTime":"2026-03-18T15:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.520216 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.526903 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.526954 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.526964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.526986 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.526999 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:58Z","lastTransitionTime":"2026-03-18T15:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.548894 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.549134 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.597147 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.597195 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.597209 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.597147 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.597367 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.597477 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.597695 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.597837 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:58 crc kubenswrapper[4696]: I0318 15:37:58.786797 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.787129 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:37:58 crc kubenswrapper[4696]: E0318 15:37:58.787271 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs podName:701f97fc-e026-4b52-ac03-e4bccbf34972 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:02.787236059 +0000 UTC m=+125.793410305 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs") pod "network-metrics-daemon-k88c8" (UID: "701f97fc-e026-4b52-ac03-e4bccbf34972") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:37:59 crc kubenswrapper[4696]: I0318 15:37:59.616291 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 15:38:00 crc kubenswrapper[4696]: I0318 15:38:00.597286 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:00 crc kubenswrapper[4696]: I0318 15:38:00.597294 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:00 crc kubenswrapper[4696]: E0318 15:38:00.597423 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:00 crc kubenswrapper[4696]: I0318 15:38:00.597537 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:00 crc kubenswrapper[4696]: E0318 15:38:00.597646 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:00 crc kubenswrapper[4696]: E0318 15:38:00.597704 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:00 crc kubenswrapper[4696]: I0318 15:38:00.598048 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:00 crc kubenswrapper[4696]: E0318 15:38:00.598244 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:02 crc kubenswrapper[4696]: I0318 15:38:02.596892 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:02 crc kubenswrapper[4696]: I0318 15:38:02.596915 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:02 crc kubenswrapper[4696]: I0318 15:38:02.596957 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:02 crc kubenswrapper[4696]: I0318 15:38:02.597012 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:02 crc kubenswrapper[4696]: E0318 15:38:02.597304 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:02 crc kubenswrapper[4696]: E0318 15:38:02.597398 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:02 crc kubenswrapper[4696]: E0318 15:38:02.597471 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:02 crc kubenswrapper[4696]: E0318 15:38:02.597596 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:02 crc kubenswrapper[4696]: E0318 15:38:02.695047 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:38:02 crc kubenswrapper[4696]: I0318 15:38:02.826953 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:02 crc kubenswrapper[4696]: E0318 15:38:02.827084 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:38:02 crc kubenswrapper[4696]: E0318 15:38:02.827141 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs podName:701f97fc-e026-4b52-ac03-e4bccbf34972 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:10.827127147 +0000 UTC m=+133.833301353 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs") pod "network-metrics-daemon-k88c8" (UID: "701f97fc-e026-4b52-ac03-e4bccbf34972") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.039861 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.054794 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.069759 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.083870 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc 
kubenswrapper[4696]: I0318 15:38:04.099038 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.113567 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.127141 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.139193 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.164698 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15
:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.180833 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373
320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.199651 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.224171 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.251976 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb
62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.269000 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07232a46e61b298ee91534498283f5e0cf00c
9fc6d8789b04c96d696807e3600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.284146 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9f2d4c6-fba6-4e58-9e36-a8737f18a409\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d50968d8853990e5727bc7800f1dc23ceaf69136e5dd91eb6baeb0493ce8723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:35:59.578389 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:35:59.583828 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:35:59.621207 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:35:59.624336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:36:26.346440 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:36:26.346512 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6be4af51a48cef4e53dec757f47fb7ed489c5c183cd4408791f9c1dfa2dc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eed26715af41fdb53a2c1eac925016c55579075ae588b5c591531d19b21f622\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.301458 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.313974 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.328337 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.597368 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.597431 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.597479 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:04 crc kubenswrapper[4696]: I0318 15:38:04.597391 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:04 crc kubenswrapper[4696]: E0318 15:38:04.597647 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:04 crc kubenswrapper[4696]: E0318 15:38:04.597810 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:04 crc kubenswrapper[4696]: E0318 15:38:04.597897 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:04 crc kubenswrapper[4696]: E0318 15:38:04.597987 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:06 crc kubenswrapper[4696]: I0318 15:38:06.597160 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:06 crc kubenswrapper[4696]: I0318 15:38:06.597200 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:06 crc kubenswrapper[4696]: E0318 15:38:06.597343 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:06 crc kubenswrapper[4696]: I0318 15:38:06.597349 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:06 crc kubenswrapper[4696]: I0318 15:38:06.597760 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:06 crc kubenswrapper[4696]: E0318 15:38:06.598016 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:06 crc kubenswrapper[4696]: E0318 15:38:06.598157 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:06 crc kubenswrapper[4696]: E0318 15:38:06.598007 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:06 crc kubenswrapper[4696]: I0318 15:38:06.598275 4696 scope.go:117] "RemoveContainer" containerID="d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.198799 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/1.log" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.202472 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5"} Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.203026 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.227912 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.248236 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.289074 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.312666 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9f2d4c6-fba6-4e58-9e36-a8737f18a409\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d50968d8853990e5727bc7800f1dc23ceaf69136e5dd91eb6baeb0493ce8723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:35:59.578389 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:35:59.583828 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:35:59.621207 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:35:59.624336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:36:26.346440 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:36:26.346512 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6be4af51a48cef4e53dec757f47fb7ed489c5c183cd4408791f9c1dfa2dc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eed26715af41fdb53a2c1eac925016c55579075ae588b5c591531d19b21f622\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.331589 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.349864 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.362185 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc 
kubenswrapper[4696]: I0318 15:38:07.379425 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.395663 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.410342 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.427054 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.445501 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373
320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.458498 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.473902 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.492137 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] 
Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\
\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.507979 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07232a46e61b298ee91534498283f5e0cf00c
9fc6d8789b04c96d696807e3600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.534331 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.613452 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9f2d4c6-fba6-4e58-9e36-a8737f18a409\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d50968d8853990e5727bc7800f1dc23ceaf69136e5dd91eb6baeb0493ce8723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:35:59.578389 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:35:59.583828 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:35:59.621207 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:35:59.624336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:36:26.346440 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:36:26.346512 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6be4af51a48cef4e53dec757f47fb7ed489c5c183cd4408791f9c1dfa2dc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eed26715af41fdb53a2c1eac925016c55579075ae588b5c591531d19b21f622\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.630766 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.642734 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.657315 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.671586 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.687716 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: E0318 15:38:07.696067 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.711562 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc 
kubenswrapper[4696]: I0318 15:38:07.726201 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.740747 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.752408 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.762796 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.773930 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07232a46e61b298ee91534498283f5e0cf00c9fc6d8789b04c96d696807e3600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.792341 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.806629 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373
320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.818409 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.830108 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:07 crc kubenswrapper[4696]: I0318 15:38:07.848764 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] 
Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\
\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.206757 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/2.log" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.207203 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/1.log" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.209614 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5" exitCode=1 Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.209671 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5"} Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.209771 4696 scope.go:117] "RemoveContainer" containerID="d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.210270 4696 scope.go:117] "RemoveContainer" containerID="b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5" Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.210423 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.222255 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.233680 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.242433 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.261917 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.279294 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373
320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.294875 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.310320 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.336753 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5fea0d831c8304cb16b97fc1052c412edb0e7c94ad8f47e1a8db4b435f16d75\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:37:53Z\\\",\\\"message\\\":\\\"c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:38:07Z\\\",\\\"message\\\":\\\"ng(nil), Groups:[]string(nil)}}\\\\nI0318 15:38:07.535930 6979 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:38:07.535019 6979 lb_config.go:1031] Cluster endpoints for openshift-console-operator/metrics for network=default are: map[]\\\\nF0318 15:38:07.535688 6979 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b66338
54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.348055 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07232a46e61b298ee91534498283f5e0cf00c
9fc6d8789b04c96d696807e3600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.366556 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.380395 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.392761 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.405013 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.420693 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9f2d4c6-fba6-4e58-9e36-a8737f18a409\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d50968d8853990e5727bc7800f1dc23ceaf69136e5dd91eb6baeb0493ce8723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:35:59.578389 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:35:59.583828 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:35:59.621207 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:35:59.624336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:36:26.346440 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:36:26.346512 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6be4af51a48cef4e53dec757f47fb7ed489c5c183cd4408791f9c1dfa2dc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eed26715af41fdb53a2c1eac925016c55579075ae588b5c591531d19b21f622\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.434595 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.456688 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.469784 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc 
kubenswrapper[4696]: I0318 15:38:08.569217 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.569282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.569294 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.569312 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.569324 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:08Z","lastTransitionTime":"2026-03-18T15:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.589010 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.595682 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.595746 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.595766 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.595793 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.595814 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:08Z","lastTransitionTime":"2026-03-18T15:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.596623 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.596667 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.596679 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.596728 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.596820 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.597010 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.597132 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.597289 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.611233 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.617371 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.617454 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.617479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.617508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.617563 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:08Z","lastTransitionTime":"2026-03-18T15:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.635084 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.640008 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.640049 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.640064 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.640084 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.640100 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:08Z","lastTransitionTime":"2026-03-18T15:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.656913 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.662440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.662474 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.662486 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.662501 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:08 crc kubenswrapper[4696]: I0318 15:38:08.662514 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:08Z","lastTransitionTime":"2026-03-18T15:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.677331 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:08 crc kubenswrapper[4696]: E0318 15:38:08.677466 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.216964 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/2.log" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.220747 4696 scope.go:117] "RemoveContainer" containerID="b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5" Mar 18 15:38:09 crc kubenswrapper[4696]: E0318 15:38:09.220905 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.235367 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.252580 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.266452 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.286559 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.308771 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.332422 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:38:07Z\\\",\\\"message\\\":\\\"ng(nil), Groups:[]string(nil)}}\\\\nI0318 15:38:07.535930 6979 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:38:07.535019 6979 lb_config.go:1031] Cluster endpoints for openshift-console-operator/metrics for network=default are: map[]\\\\nF0318 15:38:07.535688 6979 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb
62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.347988 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07232a46e61b298ee91534498283f5e0cf00c
9fc6d8789b04c96d696807e3600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.369435 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.384933 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.396368 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.407452 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.417472 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.429979 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9f2d4c6-fba6-4e58-9e36-a8737f18a409\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d50968d8853990e5727bc7800f1dc23ceaf69136e5dd91eb6baeb0493ce8723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:35:59.578389 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:35:59.583828 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:35:59.621207 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:35:59.624336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:36:26.346440 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:36:26.346512 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6be4af51a48cef4e53dec757f47fb7ed489c5c183cd4408791f9c1dfa2dc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eed26715af41fdb53a2c1eac925016c55579075ae588b5c591531d19b21f622\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.441791 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.454186 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.475496 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:09 crc kubenswrapper[4696]: I0318 15:38:09.506463 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:09Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:10 crc kubenswrapper[4696]: I0318 15:38:10.596599 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:10 crc kubenswrapper[4696]: I0318 15:38:10.596693 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:10 crc kubenswrapper[4696]: I0318 15:38:10.596720 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:10 crc kubenswrapper[4696]: E0318 15:38:10.597032 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:10 crc kubenswrapper[4696]: I0318 15:38:10.597076 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:10 crc kubenswrapper[4696]: E0318 15:38:10.597223 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:10 crc kubenswrapper[4696]: E0318 15:38:10.597364 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:10 crc kubenswrapper[4696]: E0318 15:38:10.597713 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:10 crc kubenswrapper[4696]: I0318 15:38:10.608707 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 15:38:10 crc kubenswrapper[4696]: I0318 15:38:10.918747 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:10 crc kubenswrapper[4696]: E0318 15:38:10.919050 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:38:10 crc kubenswrapper[4696]: E0318 15:38:10.919223 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs podName:701f97fc-e026-4b52-ac03-e4bccbf34972 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:26.919188932 +0000 UTC m=+149.925363178 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs") pod "network-metrics-daemon-k88c8" (UID: "701f97fc-e026-4b52-ac03-e4bccbf34972") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:38:12 crc kubenswrapper[4696]: I0318 15:38:12.597174 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:12 crc kubenswrapper[4696]: I0318 15:38:12.597246 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:12 crc kubenswrapper[4696]: I0318 15:38:12.597174 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:12 crc kubenswrapper[4696]: E0318 15:38:12.597381 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:12 crc kubenswrapper[4696]: I0318 15:38:12.597196 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:12 crc kubenswrapper[4696]: E0318 15:38:12.597314 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:12 crc kubenswrapper[4696]: E0318 15:38:12.597506 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:12 crc kubenswrapper[4696]: E0318 15:38:12.597566 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:12 crc kubenswrapper[4696]: E0318 15:38:12.697307 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:38:13 crc kubenswrapper[4696]: I0318 15:38:13.613868 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 15:38:14 crc kubenswrapper[4696]: I0318 15:38:14.596608 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:14 crc kubenswrapper[4696]: I0318 15:38:14.596664 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:14 crc kubenswrapper[4696]: I0318 15:38:14.596721 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:14 crc kubenswrapper[4696]: E0318 15:38:14.596880 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:14 crc kubenswrapper[4696]: E0318 15:38:14.597032 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:14 crc kubenswrapper[4696]: E0318 15:38:14.597168 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:14 crc kubenswrapper[4696]: I0318 15:38:14.597253 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:14 crc kubenswrapper[4696]: E0318 15:38:14.597363 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:16 crc kubenswrapper[4696]: I0318 15:38:16.597101 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:16 crc kubenswrapper[4696]: I0318 15:38:16.597193 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:16 crc kubenswrapper[4696]: I0318 15:38:16.597233 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:16 crc kubenswrapper[4696]: I0318 15:38:16.597151 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:16 crc kubenswrapper[4696]: E0318 15:38:16.597325 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:16 crc kubenswrapper[4696]: E0318 15:38:16.597445 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:16 crc kubenswrapper[4696]: E0318 15:38:16.597572 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:16 crc kubenswrapper[4696]: E0318 15:38:16.597637 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.612481 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5f2a34-67d8-42ae-aeb2-14df853224fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ad97f0893d14187ba6a594d37301ce91193e41d97cba600a26acde8bf16746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75378392c29b28f1271ad32f90c030a02e20efaeb9f4e33b462c445d31427213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c1df36f7f5aa9da678a3b7317fd6caeabd0b8abacf0f95d6af40b599e978e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a4f2ab0c9da67f71a0761016cf09d58cdc97abb4f88197165e533cd4346c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4f2ab0c9da67f71a0761016cf09d58cdc97abb4f88197165e533cd4346c199\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.629645 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.646697 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e865105-7459-4de0-ade2-9bac1ff5f094\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68a69dcd4ef157c15cdd54345e7a27def2a9e7f374d8053fd9f37c9b0cd120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb64413ba4adfb96d012b7766189a651dc26b523201595d2a1566e6aa3da5256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50bf6d182cb4ee5f1a76965fab9c408218039bb7c7d6e4d0a7fae71cc9ba59ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da5c268bbcbed48ac7f2c5987abb971a0e85419967ec7aefbeac916a5d8ee89b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e86a
4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e86a4ad8757b98411a3327384c30a0442bd604225b5367905f93b45947f9e86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58776809ab3caa42b02c68212fc042e6fd7460e50fff31143355feffbb26f213\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642bc3454fd30b4ffdc397fe6ca9f94d323fa0b7925c11bba1058bffe5069139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcnmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7nz9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.663445 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k88c8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"701f97fc-e026-4b52-ac03-e4bccbf34972\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89zn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k88c8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc 
kubenswrapper[4696]: I0318 15:38:17.679793 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5dc220644025ef7f43b672d274cfd1c6c8eae619ddb61e03c74e549fccedb91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.697039 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743aa48253f853a27dd116d197347935743d3ce83cf19503d3f6bf54ce19121d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: E0318 15:38:17.697771 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.714751 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.727212 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kxs7v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dd9d1ab-5bb8-4005-bf13-be7a7620e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644501d5f60499ba8bae2b198471b512cdbd1f13eb1c2e628e88090f1118658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gccsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kxs7v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.755741 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15
:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.770752 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373
320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.781938 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70bb9c76f0db86fc55d28fb819e0e7e01ad5241c7872c24076a77ae2b5eb4cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d9e43cb44bfffc3b1bb6fdfd81f0f60079a77d85c03a2b7ea59f4f6e1c9dd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.795672 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w9dbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49424478-cad5-4788-b01e-4ebde47480e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k66fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w9dbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.814704 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc15d44-2b63-40b8-b9c8-dad533d01710\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:38:07Z\\\",\\\"message\\\":\\\"ng(nil), Groups:[]string(nil)}}\\\\nI0318 15:38:07.535930 6979 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:38:07.535019 6979 lb_config.go:1031] Cluster endpoints for openshift-console-operator/metrics for network=default are: map[]\\\\nF0318 15:38:07.535688 6979 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731acfaf4873c262cb
62697657c59cff05cb9ca33064faca42a21941b6633854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vvzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lqxgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.829087 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e691f542-103a-4e95-966d-dbb5ab5ddaee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89b2ec36f37b8482847d2b8ed7cdd8510ee939015546708b4dd36ddbc28e2efc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07232a46e61b298ee91534498283f5e0cf00c
9fc6d8789b04c96d696807e3600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4lmx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cvzw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.843378 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9f2d4c6-fba6-4e58-9e36-a8737f18a409\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d50968d8853990e5727bc7800f1dc23ceaf69136e5dd91eb6baeb0493ce8723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5829b2c642f05cf5682b8e8cbe2079b16f6a4b046e773d51f36011ffcb50ce14\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:35:59.578389 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:35:59.583828 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:35:59.621207 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:35:59.624336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:36:26.346440 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:36:26.346512 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c6be4af51a48cef4e53dec757f47fb7ed489c5c183cd4408791f9c1dfa2dc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eed26715af41fdb53a2c1eac925016c55579075ae588b5c591531d19b21f622\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.856982 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd3d8cc-2d2e-48ac-80d6-f00dcc2a6ef4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://332711dc63c9bf3d6c8a9cbba45bb6085d89f15b1c238e5a3a57e381f6804159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7934
26f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://883ca3ee77443187b0bed76a9fb774a3afa002fa4cf735ad4e5c069b16dc7de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://883ca3ee77443187b0bed76a9fb774a3afa002fa4cf735ad4e5c069b16dc7de1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.869672 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.881287 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8l8zp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d76f08-84b8-44cb-b179-ebc9bc26a8b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a73dd9dc92287aa6bf96157a0f90fe126e76ad38a181407ddc312b84c4fccaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8l8zp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:17 crc kubenswrapper[4696]: I0318 15:38:17.893798 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d74b6f45-9bfc-4439-b43b-03f441c544fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f142e0a5f2fb8b65e41ae390ad381b52a149f74409175babd801cb5e8735ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjv48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.597507 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.597606 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.597562 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.597742 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.597922 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.598157 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.598357 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.598604 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.769225 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.770256 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.770393 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.770569 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.770772 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:18Z","lastTransitionTime":"2026-03-18T15:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.788707 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.794768 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.794836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.794854 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.794875 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.794888 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:18Z","lastTransitionTime":"2026-03-18T15:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.815869 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.821241 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.821320 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.821333 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.821356 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.821373 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:18Z","lastTransitionTime":"2026-03-18T15:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.841779 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.846716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.846805 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.846825 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.846856 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.846874 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:18Z","lastTransitionTime":"2026-03-18T15:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.864000 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.869321 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.869375 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.869390 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.869413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:18 crc kubenswrapper[4696]: I0318 15:38:18.869431 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:18Z","lastTransitionTime":"2026-03-18T15:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.885269 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"33442fad-71cc-47a2-b717-94dce6899c46\\\",\\\"systemUUID\\\":\\\"a02dd351-206a-4946-acba-446bc8ebd92d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:18 crc kubenswrapper[4696]: E0318 15:38:18.885557 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.597296 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.597367 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.597296 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.597474 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.597635 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.597498 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.597774 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.597898 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.598922 4696 scope.go:117] "RemoveContainer" containerID="b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5" Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.599164 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lqxgs_openshift-ovn-kubernetes(1dc15d44-2b63-40b8-b9c8-dad533d01710)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.839145 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.839555 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:24.839452908 +0000 UTC m=+207.845627154 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.839634 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.839706 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.839829 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.839905 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.839944 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:24.83991929 +0000 UTC m=+207.846093666 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.839984 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:24.839965421 +0000 UTC m=+207.846139877 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.941948 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:20 crc kubenswrapper[4696]: I0318 15:38:20.942198 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.942302 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.942351 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.942380 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.942500 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:24.942448969 +0000 UTC m=+207.948623225 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.942571 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.942607 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.942629 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:20 crc kubenswrapper[4696]: E0318 15:38:20.942706 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:24.942680935 +0000 UTC m=+207.948855191 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:22 crc kubenswrapper[4696]: I0318 15:38:22.597489 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:22 crc kubenswrapper[4696]: E0318 15:38:22.597784 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:22 crc kubenswrapper[4696]: I0318 15:38:22.597817 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:22 crc kubenswrapper[4696]: I0318 15:38:22.597863 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:22 crc kubenswrapper[4696]: I0318 15:38:22.597817 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:22 crc kubenswrapper[4696]: E0318 15:38:22.597978 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:22 crc kubenswrapper[4696]: E0318 15:38:22.598108 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:22 crc kubenswrapper[4696]: E0318 15:38:22.598252 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:22 crc kubenswrapper[4696]: E0318 15:38:22.699378 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:38:24 crc kubenswrapper[4696]: I0318 15:38:24.597290 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:24 crc kubenswrapper[4696]: I0318 15:38:24.597366 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:24 crc kubenswrapper[4696]: I0318 15:38:24.597324 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:24 crc kubenswrapper[4696]: E0318 15:38:24.597545 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:24 crc kubenswrapper[4696]: I0318 15:38:24.597460 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:24 crc kubenswrapper[4696]: E0318 15:38:24.597776 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:24 crc kubenswrapper[4696]: E0318 15:38:24.597918 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:24 crc kubenswrapper[4696]: E0318 15:38:24.598063 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:26 crc kubenswrapper[4696]: I0318 15:38:26.597282 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:26 crc kubenswrapper[4696]: I0318 15:38:26.597308 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:26 crc kubenswrapper[4696]: E0318 15:38:26.597422 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:26 crc kubenswrapper[4696]: I0318 15:38:26.597461 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:26 crc kubenswrapper[4696]: E0318 15:38:26.597602 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:26 crc kubenswrapper[4696]: I0318 15:38:26.597634 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:26 crc kubenswrapper[4696]: E0318 15:38:26.597691 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:26 crc kubenswrapper[4696]: E0318 15:38:26.597755 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:26 crc kubenswrapper[4696]: I0318 15:38:26.931311 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:26 crc kubenswrapper[4696]: E0318 15:38:26.931548 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:38:26 crc kubenswrapper[4696]: E0318 15:38:26.931652 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs podName:701f97fc-e026-4b52-ac03-e4bccbf34972 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.931626852 +0000 UTC m=+181.937801078 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs") pod "network-metrics-daemon-k88c8" (UID: "701f97fc-e026-4b52-ac03-e4bccbf34972") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.620103 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8f1c577-8bd3-48a0-a969-5651d3ac6722\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62aec54598a49d7e720be8bb535c20604dd57bca4bb75a5a63a8696a1a33eacc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6c0ea1c6e5965f7ae13d4ed3c4ee6f756d1ec7080c3ca210bc390e118a8fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30ea805562626b47da064adb8f783d2a0a2045c0d15e3d31b31f5495a8946aa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3101b
0e23ea541ebd60851efe67b945d2c84e972ce48ce613a0d710d91b7d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de754da7598efab4ed2ed49c14918709f43c7a58e87457c7797da2a064e974b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://405c59c68d9f0b6284adf2bce4dbf84a06a6102df3ac58f1a4e8cb78c1f50d5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0278770a3e278dbbf28c51579177a91040c7c11916b4613b6c056e6f57a680c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14bcd39c827e35b9bd59426d7d090134511ebeb1ef0a95dac86a3ce444d8f633\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.640107 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64ea155e-f1fc-4919-94e1-249625f0fefb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:36:57Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 15:36:57.250346 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:36:57.250655 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:36:57.251586 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-505452562/tls.crt::/tmp/serving-cert-505452562/tls.key\\\\\\\"\\\\nI0318 15:36:57.618430 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:36:57.620095 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:36:57.620113 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:36:57.620136 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:36:57.620141 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:36:57.624870 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:36:57.624938 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624950 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:36:57.624961 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:36:57.624971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:36:57.624980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:36:57.624988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 15:36:57.624879 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 15:36:57.627087 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0283cc3dbd9624ebcdd3142610d0b6373
320a457dd076f3048de0464503550e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:27 crc kubenswrapper[4696]: E0318 15:38:27.700032 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.705502 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w9dbn" podStartSLOduration=103.705489628 podStartE2EDuration="1m43.705489628s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.704953404 +0000 UTC m=+150.711127610" watchObservedRunningTime="2026-03-18 15:38:27.705489628 +0000 UTC m=+150.711663834" Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.748949 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cvzw9" podStartSLOduration=102.748930411 podStartE2EDuration="1m42.748930411s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.748417698 +0000 UTC m=+150.754591914" watchObservedRunningTime="2026-03-18 15:38:27.748930411 +0000 UTC m=+150.755104617" Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.779799 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=28.779779367 podStartE2EDuration="28.779779367s" podCreationTimestamp="2026-03-18 15:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.769794046 +0000 UTC m=+150.775968252" watchObservedRunningTime="2026-03-18 15:38:27.779779367 +0000 UTC m=+150.785953573" Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.792839 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.792814235 
podStartE2EDuration="14.792814235s" podCreationTimestamp="2026-03-18 15:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.779971492 +0000 UTC m=+150.786145738" watchObservedRunningTime="2026-03-18 15:38:27.792814235 +0000 UTC m=+150.798988441" Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.806963 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8l8zp" podStartSLOduration=103.806941231 podStartE2EDuration="1m43.806941231s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.806879969 +0000 UTC m=+150.813054205" watchObservedRunningTime="2026-03-18 15:38:27.806941231 +0000 UTC m=+150.813115437" Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.820157 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podStartSLOduration=103.820136783 podStartE2EDuration="1m43.820136783s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.81960475 +0000 UTC m=+150.825778956" watchObservedRunningTime="2026-03-18 15:38:27.820136783 +0000 UTC m=+150.826310989" Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.832749 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=17.83273238 podStartE2EDuration="17.83273238s" podCreationTimestamp="2026-03-18 15:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.831872578 +0000 UTC 
m=+150.838046794" watchObservedRunningTime="2026-03-18 15:38:27.83273238 +0000 UTC m=+150.838906576" Mar 18 15:38:27 crc kubenswrapper[4696]: I0318 15:38:27.873502 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c7nz9" podStartSLOduration=103.873482066 podStartE2EDuration="1m43.873482066s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.862724435 +0000 UTC m=+150.868898671" watchObservedRunningTime="2026-03-18 15:38:27.873482066 +0000 UTC m=+150.879656272" Mar 18 15:38:28 crc kubenswrapper[4696]: I0318 15:38:28.597103 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:28 crc kubenswrapper[4696]: I0318 15:38:28.597167 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:28 crc kubenswrapper[4696]: I0318 15:38:28.597237 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:28 crc kubenswrapper[4696]: E0318 15:38:28.597379 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:28 crc kubenswrapper[4696]: E0318 15:38:28.597638 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:28 crc kubenswrapper[4696]: E0318 15:38:28.597658 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:28 crc kubenswrapper[4696]: I0318 15:38:28.597740 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:28 crc kubenswrapper[4696]: E0318 15:38:28.597840 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.172740 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.172782 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.172794 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.172841 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.172854 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:29Z","lastTransitionTime":"2026-03-18T15:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.224084 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kxs7v" podStartSLOduration=105.224059105 podStartE2EDuration="1m45.224059105s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.931094806 +0000 UTC m=+150.937269012" watchObservedRunningTime="2026-03-18 15:38:29.224059105 +0000 UTC m=+152.230233331" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.225759 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns"] Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.226303 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.229858 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.229958 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.230089 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.230096 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.251837 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ef96c49e-48fe-4335-badd-e53836a8fea6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.251889 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef96c49e-48fe-4335-badd-e53836a8fea6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.251942 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef96c49e-48fe-4335-badd-e53836a8fea6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.252010 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef96c49e-48fe-4335-badd-e53836a8fea6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.252058 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef96c49e-48fe-4335-badd-e53836a8fea6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.254577 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=59.254552693 podStartE2EDuration="59.254552693s" podCreationTimestamp="2026-03-18 15:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:29.25325878 +0000 UTC m=+152.259433026" watchObservedRunningTime="2026-03-18 15:38:29.254552693 +0000 UTC m=+152.260726919" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.271827 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.271807677 podStartE2EDuration="1m10.271807677s" podCreationTimestamp="2026-03-18 15:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:29.271485769 +0000 UTC m=+152.277659995" watchObservedRunningTime="2026-03-18 15:38:29.271807677 +0000 UTC m=+152.277981883" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.298790 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w9dbn_49424478-cad5-4788-b01e-4ebde47480e1/kube-multus/0.log" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.298837 4696 generic.go:334] "Generic (PLEG): container finished" podID="49424478-cad5-4788-b01e-4ebde47480e1" containerID="684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049" exitCode=1 Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.298869 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w9dbn" event={"ID":"49424478-cad5-4788-b01e-4ebde47480e1","Type":"ContainerDied","Data":"684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049"} Mar 18 15:38:29 crc 
kubenswrapper[4696]: I0318 15:38:29.299241 4696 scope.go:117] "RemoveContainer" containerID="684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.352999 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef96c49e-48fe-4335-badd-e53836a8fea6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.353396 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef96c49e-48fe-4335-badd-e53836a8fea6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.353468 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef96c49e-48fe-4335-badd-e53836a8fea6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.353546 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef96c49e-48fe-4335-badd-e53836a8fea6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.353618 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef96c49e-48fe-4335-badd-e53836a8fea6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.353687 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef96c49e-48fe-4335-badd-e53836a8fea6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.353713 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef96c49e-48fe-4335-badd-e53836a8fea6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.353865 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef96c49e-48fe-4335-badd-e53836a8fea6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.360708 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef96c49e-48fe-4335-badd-e53836a8fea6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.378424 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef96c49e-48fe-4335-badd-e53836a8fea6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wqgns\" (UID: \"ef96c49e-48fe-4335-badd-e53836a8fea6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.538590 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" Mar 18 15:38:29 crc kubenswrapper[4696]: W0318 15:38:29.551909 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef96c49e_48fe_4335_badd_e53836a8fea6.slice/crio-cb52f73ae5a190569bdc8ce60e4ca836f44e120cc9a93e1cb4f31279d6a6bd01 WatchSource:0}: Error finding container cb52f73ae5a190569bdc8ce60e4ca836f44e120cc9a93e1cb4f31279d6a6bd01: Status 404 returned error can't find the container with id cb52f73ae5a190569bdc8ce60e4ca836f44e120cc9a93e1cb4f31279d6a6bd01 Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.935143 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 15:38:29 crc kubenswrapper[4696]: I0318 15:38:29.942303 4696 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.303818 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w9dbn_49424478-cad5-4788-b01e-4ebde47480e1/kube-multus/0.log" Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.303948 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w9dbn" 
event={"ID":"49424478-cad5-4788-b01e-4ebde47480e1","Type":"ContainerStarted","Data":"fa794b83238daf1378a5166ca37e2d751e1a3333d97dd591aa631c9bf3e539b5"} Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.305678 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" event={"ID":"ef96c49e-48fe-4335-badd-e53836a8fea6","Type":"ContainerStarted","Data":"005eaab7446624410210dedaa63cab03d82e1a8c7e5facbb17db4547cdc8a6b7"} Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.305715 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" event={"ID":"ef96c49e-48fe-4335-badd-e53836a8fea6","Type":"ContainerStarted","Data":"cb52f73ae5a190569bdc8ce60e4ca836f44e120cc9a93e1cb4f31279d6a6bd01"} Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.340301 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqgns" podStartSLOduration=106.340280568 podStartE2EDuration="1m46.340280568s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:30.338575135 +0000 UTC m=+153.344749361" watchObservedRunningTime="2026-03-18 15:38:30.340280568 +0000 UTC m=+153.346454764" Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.597154 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.597248 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.597259 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:30 crc kubenswrapper[4696]: E0318 15:38:30.597476 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:30 crc kubenswrapper[4696]: E0318 15:38:30.597289 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:30 crc kubenswrapper[4696]: I0318 15:38:30.597299 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:30 crc kubenswrapper[4696]: E0318 15:38:30.597576 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:30 crc kubenswrapper[4696]: E0318 15:38:30.597660 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:32 crc kubenswrapper[4696]: I0318 15:38:32.596844 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:32 crc kubenswrapper[4696]: I0318 15:38:32.596947 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:32 crc kubenswrapper[4696]: E0318 15:38:32.596982 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:32 crc kubenswrapper[4696]: E0318 15:38:32.597136 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:32 crc kubenswrapper[4696]: I0318 15:38:32.597215 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:32 crc kubenswrapper[4696]: E0318 15:38:32.597279 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:32 crc kubenswrapper[4696]: I0318 15:38:32.597350 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:32 crc kubenswrapper[4696]: E0318 15:38:32.597433 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:32 crc kubenswrapper[4696]: E0318 15:38:32.701226 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:38:34 crc kubenswrapper[4696]: I0318 15:38:34.596560 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:34 crc kubenswrapper[4696]: I0318 15:38:34.596583 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:34 crc kubenswrapper[4696]: I0318 15:38:34.596623 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:34 crc kubenswrapper[4696]: E0318 15:38:34.596870 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:34 crc kubenswrapper[4696]: E0318 15:38:34.597069 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:34 crc kubenswrapper[4696]: I0318 15:38:34.596938 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:34 crc kubenswrapper[4696]: E0318 15:38:34.597469 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:34 crc kubenswrapper[4696]: E0318 15:38:34.597549 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:34 crc kubenswrapper[4696]: I0318 15:38:34.597884 4696 scope.go:117] "RemoveContainer" containerID="b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5" Mar 18 15:38:35 crc kubenswrapper[4696]: I0318 15:38:35.326639 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/2.log" Mar 18 15:38:35 crc kubenswrapper[4696]: I0318 15:38:35.330577 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerStarted","Data":"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf"} Mar 18 15:38:35 crc kubenswrapper[4696]: I0318 15:38:35.331007 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:38:35 crc kubenswrapper[4696]: I0318 15:38:35.360596 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podStartSLOduration=111.360568244 podStartE2EDuration="1m51.360568244s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:35.359445536 +0000 UTC 
m=+158.365619742" watchObservedRunningTime="2026-03-18 15:38:35.360568244 +0000 UTC m=+158.366742450" Mar 18 15:38:35 crc kubenswrapper[4696]: I0318 15:38:35.655281 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k88c8"] Mar 18 15:38:35 crc kubenswrapper[4696]: I0318 15:38:35.655560 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:35 crc kubenswrapper[4696]: E0318 15:38:35.655873 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972" Mar 18 15:38:36 crc kubenswrapper[4696]: I0318 15:38:36.597361 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:36 crc kubenswrapper[4696]: I0318 15:38:36.597460 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:36 crc kubenswrapper[4696]: E0318 15:38:36.597623 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:36 crc kubenswrapper[4696]: E0318 15:38:36.597710 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:36 crc kubenswrapper[4696]: I0318 15:38:36.597852 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:36 crc kubenswrapper[4696]: E0318 15:38:36.597932 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:37 crc kubenswrapper[4696]: I0318 15:38:37.596604 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:37 crc kubenswrapper[4696]: E0318 15:38:37.599030 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972"
Mar 18 15:38:37 crc kubenswrapper[4696]: E0318 15:38:37.701818 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 15:38:38 crc kubenswrapper[4696]: I0318 15:38:38.596876 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:38:38 crc kubenswrapper[4696]: E0318 15:38:38.597298 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:38:38 crc kubenswrapper[4696]: I0318 15:38:38.597023 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:38:38 crc kubenswrapper[4696]: E0318 15:38:38.597506 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:38:38 crc kubenswrapper[4696]: I0318 15:38:38.598273 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:38:38 crc kubenswrapper[4696]: E0318 15:38:38.598454 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:38:39 crc kubenswrapper[4696]: I0318 15:38:39.597364 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8"
Mar 18 15:38:39 crc kubenswrapper[4696]: E0318 15:38:39.597686 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972"
Mar 18 15:38:40 crc kubenswrapper[4696]: I0318 15:38:40.596820 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:38:40 crc kubenswrapper[4696]: I0318 15:38:40.596873 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:38:40 crc kubenswrapper[4696]: E0318 15:38:40.597012 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:38:40 crc kubenswrapper[4696]: I0318 15:38:40.597105 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:38:40 crc kubenswrapper[4696]: E0318 15:38:40.597292 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:38:40 crc kubenswrapper[4696]: E0318 15:38:40.597420 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:38:41 crc kubenswrapper[4696]: I0318 15:38:41.596574 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8"
Mar 18 15:38:41 crc kubenswrapper[4696]: E0318 15:38:41.596758 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k88c8" podUID="701f97fc-e026-4b52-ac03-e4bccbf34972"
Mar 18 15:38:42 crc kubenswrapper[4696]: I0318 15:38:42.556730 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs"
Mar 18 15:38:42 crc kubenswrapper[4696]: I0318 15:38:42.597432 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:38:42 crc kubenswrapper[4696]: I0318 15:38:42.597465 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:38:42 crc kubenswrapper[4696]: I0318 15:38:42.597505 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:38:42 crc kubenswrapper[4696]: E0318 15:38:42.597592 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:38:42 crc kubenswrapper[4696]: E0318 15:38:42.597718 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:38:42 crc kubenswrapper[4696]: E0318 15:38:42.597788 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:38:43 crc kubenswrapper[4696]: I0318 15:38:43.596478 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8"
Mar 18 15:38:43 crc kubenswrapper[4696]: I0318 15:38:43.598252 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 15:38:43 crc kubenswrapper[4696]: I0318 15:38:43.598986 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 18 15:38:44 crc kubenswrapper[4696]: I0318 15:38:44.596912 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:38:44 crc kubenswrapper[4696]: I0318 15:38:44.596966 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:38:44 crc kubenswrapper[4696]: I0318 15:38:44.596932 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:38:44 crc kubenswrapper[4696]: I0318 15:38:44.600929 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 15:38:44 crc kubenswrapper[4696]: I0318 15:38:44.602153 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 15:38:44 crc kubenswrapper[4696]: I0318 15:38:44.602564 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 15:38:44 crc kubenswrapper[4696]: I0318 15:38:44.605002 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.499545 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.539279 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.539787 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.539807 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.540272 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.542462 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75kz8"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.546903 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6k8ll"]
Mar 18 15:38:49 crc kubenswrapper[4696]: W0318 15:38:49.547023 4696 reflector.go:561] object-"openshift-config-operator"/"config-operator-serving-cert": failed to list *v1.Secret: secrets "config-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object
Mar 18 15:38:49 crc kubenswrapper[4696]: E0318 15:38:49.547062 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"config-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"config-operator-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 18 15:38:49 crc kubenswrapper[4696]: W0318 15:38:49.547068 4696 reflector.go:561] object-"openshift-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object
Mar 18 15:38:49 crc kubenswrapper[4696]: E0318 15:38:49.547139 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.547149 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8"
Mar 18 15:38:49 crc kubenswrapper[4696]: W0318 15:38:49.547268 4696 reflector.go:561] object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z": failed to list *v1.Secret: secrets "openshift-config-operator-dockercfg-7pc5z" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object
Mar 18 15:38:49 crc kubenswrapper[4696]: E0318 15:38:49.547294 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-7pc5z\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-config-operator-dockercfg-7pc5z\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.547472 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.547687 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.547702 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.547803 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.547928 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.547941 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.548007 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.548037 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mcsh8"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.548013 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.548210 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.548095 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.548671 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.549080 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.551504 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-485wk"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.550857 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.550965 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.551814 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-qbzqg"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.552160 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qbzqg"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.552178 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6xjgb"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.552237 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.552324 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-485wk"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.553845 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.554174 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.558735 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.559130 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.559371 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.559977 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.561706 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.571134 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.571501 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.571900 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.572067 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.572118 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.573201 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.573431 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.573704 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.574012 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.574341 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.574392 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.575457 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.575895 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.576260 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.576709 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.576713 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.577090 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.577363 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.578327 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.578532 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.578569 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.578698 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.578782 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.580859 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.582633 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.582917 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.583302 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.583433 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.583999 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.583302 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.589588 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.589774 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.589877 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.589985 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.589879 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.590160 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.590699 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.590818 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.591814 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.592102 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.592362 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.592976 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8mtz5"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.593039 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.593153 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.593473 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.594359 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.594823 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-498px"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.595160 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-498px"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.595419 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9wfks"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.596802 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.596859 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.597011 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.597149 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.597271 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.597280 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.597475 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.597591 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.597926 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.598255 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.600097 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g894v"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.600454 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xk9kb"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.601487 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xk9kb"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.602892 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g894v"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.604410 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9wfks"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.612323 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.612578 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.612758 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.615304 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.616358 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.618555 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.623042 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-69fs4"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.623616 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.624003 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.631859 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-69fs4"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.632244 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.633156 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.633646 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.634081 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.635057 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.635117 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.635065 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.635463 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.636183 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.639347 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4wjj"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.642046 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.643195 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.643586 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.644331 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.644348 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.644514 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.645188 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.645490 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.645683 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.646047 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.646097 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.646435 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.646454 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.646488 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.646944 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.647341 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.647558 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x2rv5"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.647582 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.647903 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.648144 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.651732 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.670545 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.670732 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.671922 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.672892 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.673038 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.673288 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.674289 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.675001 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4"]
Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.677146 4696 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.678861 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.681104 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.681153 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.681805 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.681970 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.682649 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.683029 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.683479 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.683549 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.683927 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.683971 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.684015 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.684044 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.684892 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.691103 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693241 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693650 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbrtt\" (UniqueName: \"kubernetes.io/projected/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-kube-api-access-zbrtt\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693690 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-ca\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693711 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-proxy-tls\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693742 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-config\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693758 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693785 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693803 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-config\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693827 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9kd\" (UniqueName: \"kubernetes.io/projected/ce730c10-9854-4705-bac9-07fc1f23402c-kube-api-access-nh9kd\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693852 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dflh7\" (UniqueName: \"kubernetes.io/projected/6b9aec63-9194-4040-b89c-6985e68607b9-kube-api-access-dflh7\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693867 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65110067-19f3-4355-bac5-fe08d6a07311-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693882 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-service-ca\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693899 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693925 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693939 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65110067-19f3-4355-bac5-fe08d6a07311-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693960 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-client\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.693976 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqbf\" (UniqueName: \"kubernetes.io/projected/d80e308e-71d5-484e-bdc3-ac15ef240b46-kube-api-access-9vqbf\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694000 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmlfq\" (UniqueName: \"kubernetes.io/projected/a589c8ef-17db-4df7-affb-8a40c753aaaa-kube-api-access-qmlfq\") pod 
\"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694015 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694032 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-images\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694049 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694065 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6b9aec63-9194-4040-b89c-6985e68607b9-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:49 crc 
kubenswrapper[4696]: I0318 15:38:49.694089 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80e308e-71d5-484e-bdc3-ac15ef240b46-serving-cert\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694105 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-policies\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694120 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694134 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694152 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694170 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce730c10-9854-4705-bac9-07fc1f23402c-serving-cert\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694188 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694202 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-dir\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694216 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694231 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694246 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9g9c\" (UniqueName: \"kubernetes.io/projected/65110067-19f3-4355-bac5-fe08d6a07311-kube-api-access-l9g9c\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694262 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.694278 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b9aec63-9194-4040-b89c-6985e68607b9-serving-cert\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 
15:38:49.694294 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-client-ca\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.701385 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.701919 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.702341 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.702593 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.702896 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.703151 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.703173 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.703421 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.703689 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.704168 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.707530 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ff562"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.708321 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.713949 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.716054 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564138-jzn9j"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.716995 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vgrlm"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.717967 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.718387 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.718621 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.719225 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.719666 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.722729 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.728465 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8286f"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.729367 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8mtz5"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.729451 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8286f" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.736928 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mcsh8"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.737700 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75kz8"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.743942 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.746155 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.748757 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6xjgb"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.748802 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9wfks"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.750449 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6k8ll"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.751872 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-69fs4"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.753982 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.764088 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.768003 
4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.768300 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4wjj"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.770339 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.771464 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.772793 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-485wk"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.774298 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.775667 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.777240 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x2rv5"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.779902 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.781596 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.782155 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.783911 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qbzqg"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.785317 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.786658 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564138-jzn9j"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.788497 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.789400 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-498px"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.790788 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.792074 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.793538 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.794779 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4bxnr"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795343 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795393 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-config\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795421 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795456 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-config\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795507 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9kd\" (UniqueName: \"kubernetes.io/projected/ce730c10-9854-4705-bac9-07fc1f23402c-kube-api-access-nh9kd\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc 
kubenswrapper[4696]: I0318 15:38:49.795547 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dflh7\" (UniqueName: \"kubernetes.io/projected/6b9aec63-9194-4040-b89c-6985e68607b9-kube-api-access-dflh7\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795566 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65110067-19f3-4355-bac5-fe08d6a07311-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795586 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-service-ca\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795606 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795627 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795651 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65110067-19f3-4355-bac5-fe08d6a07311-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795760 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-client\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795822 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqbf\" (UniqueName: \"kubernetes.io/projected/d80e308e-71d5-484e-bdc3-ac15ef240b46-kube-api-access-9vqbf\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795873 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmlfq\" (UniqueName: \"kubernetes.io/projected/a589c8ef-17db-4df7-affb-8a40c753aaaa-kube-api-access-qmlfq\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 
15:38:49.795927 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795959 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-images\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.795996 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796025 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-policies\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796055 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6b9aec63-9194-4040-b89c-6985e68607b9-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: 
\"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796069 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796080 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80e308e-71d5-484e-bdc3-ac15ef240b46-serving-cert\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796103 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796124 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796150 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce730c10-9854-4705-bac9-07fc1f23402c-serving-cert\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796174 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796200 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796226 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796279 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796306 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-l9g9c\" (UniqueName: \"kubernetes.io/projected/65110067-19f3-4355-bac5-fe08d6a07311-kube-api-access-l9g9c\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796333 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-dir\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796336 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-config\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796357 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796399 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b9aec63-9194-4040-b89c-6985e68607b9-serving-cert\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:49 crc 
kubenswrapper[4696]: I0318 15:38:49.796428 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-client-ca\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796462 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbrtt\" (UniqueName: \"kubernetes.io/projected/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-kube-api-access-zbrtt\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796489 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-ca\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796512 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-proxy-tls\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.796543 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-service-ca\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.797381 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.798243 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-config\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.798242 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6b9aec63-9194-4040-b89c-6985e68607b9-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.799415 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.799748 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-policies\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.799795 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2x748"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.800092 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-dir\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.800323 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65110067-19f3-4355-bac5-fe08d6a07311-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.800365 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-ca\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.800810 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g894v"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.800876 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-client-ca\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.801082 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2x748" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.801413 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8286f"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.801777 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.801889 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.802240 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.802925 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ff562"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.803092 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/65110067-19f3-4355-bac5-fe08d6a07311-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.803385 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce730c10-9854-4705-bac9-07fc1f23402c-serving-cert\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.803814 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d80e308e-71d5-484e-bdc3-ac15ef240b46-etcd-client\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.803831 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.803873 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 
15:38:49.804175 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.804323 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.805256 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.805634 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.806619 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.807295 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.808406 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.809501 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.810555 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.811592 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4bxnr"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.811910 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.812603 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.812849 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc 
kubenswrapper[4696]: I0318 15:38:49.812866 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80e308e-71d5-484e-bdc3-ac15ef240b46-serving-cert\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.813067 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2x748"] Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.814460 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.822106 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.842302 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.864699 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.882444 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.902043 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 15:38:49 crc 
kubenswrapper[4696]: I0318 15:38:49.922399 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.942427 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.962876 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.982472 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 15:38:49 crc kubenswrapper[4696]: I0318 15:38:49.987604 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-images\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.002137 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.021755 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.042906 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.053128 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-proxy-tls\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.082140 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.102173 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.122902 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.149257 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.162581 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.182371 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.201758 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.221443 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.241890 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 
15:38:50.261211 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.282483 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.302410 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.342320 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.362374 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.381551 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.402641 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.421910 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.442171 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.462431 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.482016 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 
15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.502643 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.522437 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.542210 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.561915 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.582209 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.601486 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.622167 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.641325 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.662635 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.681277 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 15:38:50 crc kubenswrapper[4696]: 
I0318 15:38:50.700260 4696 request.go:700] Waited for 1.015674612s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.701742 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.721825 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.742585 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.761933 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.782370 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: E0318 15:38:50.799642 4696 secret.go:188] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 15:38:50 crc kubenswrapper[4696]: E0318 15:38:50.799730 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b9aec63-9194-4040-b89c-6985e68607b9-serving-cert podName:6b9aec63-9194-4040-b89c-6985e68607b9 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:51.299710287 +0000 UTC m=+174.305884493 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6b9aec63-9194-4040-b89c-6985e68607b9-serving-cert") pod "openshift-config-operator-7777fb866f-g5mnq" (UID: "6b9aec63-9194-4040-b89c-6985e68607b9") : failed to sync secret cache: timed out waiting for the condition Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.801860 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.821880 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.843028 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.890396 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.890670 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.902348 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.922188 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.942174 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.962885 4696 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 15:38:50 crc kubenswrapper[4696]: I0318 15:38:50.982734 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.003011 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.034397 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.042133 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.062904 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.082098 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.102354 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.122201 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.142466 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.163056 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.182941 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.203061 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.222481 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.242997 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.262798 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.282287 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.303498 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.311333 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b9aec63-9194-4040-b89c-6985e68607b9-serving-cert\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.323236 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.343222 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.363342 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.382639 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.403339 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.423171 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.443012 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.462975 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.483395 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.543718 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.545756 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9kd\" (UniqueName: \"kubernetes.io/projected/ce730c10-9854-4705-bac9-07fc1f23402c-kube-api-access-nh9kd\") pod 
\"controller-manager-879f6c89f-mcsh8\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.564229 4696 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.582412 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.623826 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqbf\" (UniqueName: \"kubernetes.io/projected/d80e308e-71d5-484e-bdc3-ac15ef240b46-kube-api-access-9vqbf\") pod \"etcd-operator-b45778765-8mtz5\" (UID: \"d80e308e-71d5-484e-bdc3-ac15ef240b46\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.638243 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmlfq\" (UniqueName: \"kubernetes.io/projected/a589c8ef-17db-4df7-affb-8a40c753aaaa-kube-api-access-qmlfq\") pod \"oauth-openshift-558db77b4-485wk\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.659151 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9g9c\" (UniqueName: \"kubernetes.io/projected/65110067-19f3-4355-bac5-fe08d6a07311-kube-api-access-l9g9c\") pod \"openshift-controller-manager-operator-756b6f6bc6-jnnmf\" (UID: \"65110067-19f3-4355-bac5-fe08d6a07311\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.678313 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbrtt\" 
(UniqueName: \"kubernetes.io/projected/80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b-kube-api-access-zbrtt\") pod \"machine-config-operator-74547568cd-bvmx5\" (UID: \"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.682660 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.700302 4696 request.go:700] Waited for 1.89889529s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.702936 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.704315 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.722196 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.741645 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.803767 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.819629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/363b3949-b8a3-4fd4-a13d-281d29c61822-node-pullsecrets\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820160 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820188 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38fdccb-264a-4d02-9f5c-19140a3df2f2-config\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820214 
4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820239 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c05edf1-5079-4212-ba5c-19621b2500cf-config\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820264 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-encryption-config\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820294 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-config\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820318 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-serving-cert\") pod 
\"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820340 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec637495-0e89-4ff0-9f59-2079144aa380-serving-cert\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820367 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/363b3949-b8a3-4fd4-a13d-281d29c61822-audit-dir\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820398 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820419 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-etcd-serving-ca\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820438 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-zthn2\" (UniqueName: \"kubernetes.io/projected/363b3949-b8a3-4fd4-a13d-281d29c61822-kube-api-access-zthn2\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820457 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-config\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.820504 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-stats-auth\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: E0318 15:38:51.820879 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.320861487 +0000 UTC m=+175.327035693 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821147 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-image-import-ca\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821191 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-service-ca\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821223 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/914a7337-c621-43a6-8294-50390f768e28-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821250 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/5744ac11-6c36-4634-903e-298dc7b5ce45-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821273 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-audit\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821294 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-config\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821321 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c05edf1-5079-4212-ba5c-19621b2500cf-images\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821347 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5f87\" (UniqueName: \"kubernetes.io/projected/f7245cec-cee9-4aa5-8087-81ad2f450977-kube-api-access-c5f87\") pod \"downloads-7954f5f757-498px\" (UID: \"f7245cec-cee9-4aa5-8087-81ad2f450977\") " pod="openshift-console/downloads-7954f5f757-498px" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821373 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6eeff11-0e98-48f5-a868-a7016d57be14-serving-cert\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821395 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-oauth-config\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821420 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zspsz\" (UniqueName: \"kubernetes.io/projected/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-kube-api-access-zspsz\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821464 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmj4r\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-kube-api-access-dmj4r\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821490 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pk8t\" (UniqueName: \"kubernetes.io/projected/0c05edf1-5079-4212-ba5c-19621b2500cf-kube-api-access-6pk8t\") pod 
\"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821532 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t98l6\" (UniqueName: \"kubernetes.io/projected/c38fdccb-264a-4d02-9f5c-19140a3df2f2-kube-api-access-t98l6\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821562 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-config\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821583 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-client-ca\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821607 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c38fdccb-264a-4d02-9f5c-19140a3df2f2-serving-cert\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821690 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28cd9598-75cc-4316-a078-c4369354b5af-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821726 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8gk\" (UniqueName: \"kubernetes.io/projected/102d31d1-4cba-46d5-8f36-727dd4379b90-kube-api-access-4k8gk\") pod \"migrator-59844c95c7-4qhl8\" (UID: \"102d31d1-4cba-46d5-8f36-727dd4379b90\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821749 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-serving-cert\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821771 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qfg\" (UniqueName: \"kubernetes.io/projected/3733dd99-82f2-4602-b0e2-ece3c16cd446-kube-api-access-75qfg\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821795 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtvvw\" (UniqueName: \"kubernetes.io/projected/55f95b82-6e61-4b39-a2e6-6685d37d61e8-kube-api-access-wtvvw\") pod \"dns-operator-744455d44c-69fs4\" (UID: 
\"55f95b82-6e61-4b39-a2e6-6685d37d61e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821817 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/28cd9598-75cc-4316-a078-c4369354b5af-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821841 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-config\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821874 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/28cd9598-75cc-4316-a078-c4369354b5af-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821901 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-default-certificate\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821925 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/914a7337-c621-43a6-8294-50390f768e28-proxy-tls\") pod \"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821950 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-tls\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821974 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwglp\" (UniqueName: \"kubernetes.io/projected/a6eeff11-0e98-48f5-a868-a7016d57be14-kube-api-access-jwglp\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.821997 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/55f95b82-6e61-4b39-a2e6-6685d37d61e8-metrics-tls\") pod \"dns-operator-744455d44c-69fs4\" (UID: \"55f95b82-6e61-4b39-a2e6-6685d37d61e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822024 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-certificates\") pod 
\"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822051 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-metrics-certs\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822075 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822102 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822129 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6gb\" (UniqueName: \"kubernetes.io/projected/ec637495-0e89-4ff0-9f59-2079144aa380-kube-api-access-tc6gb\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc 
kubenswrapper[4696]: I0318 15:38:51.822168 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-bound-sa-token\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822193 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822214 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-oauth-serving-cert\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94443ebd-69c4-4f6b-90a6-13cd2da51741-service-ca-bundle\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822253 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c38fdccb-264a-4d02-9f5c-19140a3df2f2-trusted-ca\") pod 
\"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822290 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-trusted-ca\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822314 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95cnx\" (UniqueName: \"kubernetes.io/projected/94443ebd-69c4-4f6b-90a6-13cd2da51741-kube-api-access-95cnx\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822343 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-service-ca-bundle\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822368 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28d5l\" (UniqueName: \"kubernetes.io/projected/28cd9598-75cc-4316-a078-c4369354b5af-kube-api-access-28d5l\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 
15:38:51.822392 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c05edf1-5079-4212-ba5c-19621b2500cf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822421 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8tkm\" (UniqueName: \"kubernetes.io/projected/914a7337-c621-43a6-8294-50390f768e28-kube-api-access-h8tkm\") pod \"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822661 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5744ac11-6c36-4634-903e-298dc7b5ce45-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822701 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-trusted-ca-bundle\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822918 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.822978 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-etcd-client\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.823023 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k779w\" (UniqueName: \"kubernetes.io/projected/198822b4-2fd5-4225-bfe8-f233eaf3d8fc-kube-api-access-k779w\") pod \"cluster-samples-operator-665b6dd947-gdd4w\" (UID: \"198822b4-2fd5-4225-bfe8-f233eaf3d8fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.823056 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/198822b4-2fd5-4225-bfe8-f233eaf3d8fc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gdd4w\" (UID: \"198822b4-2fd5-4225-bfe8-f233eaf3d8fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.823080 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.828723 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.833844 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dflh7\" (UniqueName: \"kubernetes.io/projected/6b9aec63-9194-4040-b89c-6985e68607b9-kube-api-access-dflh7\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.839725 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.842942 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.855849 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b9aec63-9194-4040-b89c-6985e68607b9-serving-cert\") pod \"openshift-config-operator-7777fb866f-g5mnq\" (UID: \"6b9aec63-9194-4040-b89c-6985e68607b9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.871149 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.894459 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mcsh8"] Mar 18 15:38:51 crc kubenswrapper[4696]: W0318 15:38:51.900496 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce730c10_9854_4705_bac9_07fc1f23402c.slice/crio-cf7fd7a402f91b8d41dbf8d6de7c2936280f5d9cddbbad7090bd757d975e6807 WatchSource:0}: Error finding container cf7fd7a402f91b8d41dbf8d6de7c2936280f5d9cddbbad7090bd757d975e6807: Status 404 returned error can't find the container with id cf7fd7a402f91b8d41dbf8d6de7c2936280f5d9cddbbad7090bd757d975e6807 Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.924623 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:51 crc kubenswrapper[4696]: E0318 15:38:51.925016 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.424987518 +0000 UTC m=+175.431161734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925170 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-bound-sa-token\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925208 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-oauth-serving-cert\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925247 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-trusted-ca\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925274 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95cnx\" (UniqueName: \"kubernetes.io/projected/94443ebd-69c4-4f6b-90a6-13cd2da51741-kube-api-access-95cnx\") pod \"router-default-5444994796-xk9kb\" (UID: 
\"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925298 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-service-ca-bundle\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925321 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28d5l\" (UniqueName: \"kubernetes.io/projected/28cd9598-75cc-4316-a078-c4369354b5af-kube-api-access-28d5l\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925348 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8tkm\" (UniqueName: \"kubernetes.io/projected/914a7337-c621-43a6-8294-50390f768e28-kube-api-access-h8tkm\") pod \"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925376 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c05edf1-5079-4212-ba5c-19621b2500cf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925425 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8339464b-c883-44d7-95eb-57c32689e91b-config-volume\") pod \"collect-profiles-29564130-4tdt6\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925453 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c59c714a-1145-4630-9d30-24d15369e2b6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925514 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-trusted-ca-bundle\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925576 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59vks\" (UniqueName: \"kubernetes.io/projected/b2dbc3c5-555b-4b76-af97-bbe1de318efe-kube-api-access-59vks\") pod \"ingress-canary-2x748\" (UID: \"b2dbc3c5-555b-4b76-af97-bbe1de318efe\") " pod="openshift-ingress-canary/ingress-canary-2x748" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925618 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-etcd-client\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 
18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925643 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53d14281-ca5f-4420-a56f-a9fd192c7e58-srv-cert\") pod \"catalog-operator-68c6474976-5r8lf\" (UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925675 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k779w\" (UniqueName: \"kubernetes.io/projected/198822b4-2fd5-4225-bfe8-f233eaf3d8fc-kube-api-access-k779w\") pod \"cluster-samples-operator-665b6dd947-gdd4w\" (UID: \"198822b4-2fd5-4225-bfe8-f233eaf3d8fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925700 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/198822b4-2fd5-4225-bfe8-f233eaf3d8fc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gdd4w\" (UID: \"198822b4-2fd5-4225-bfe8-f233eaf3d8fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925723 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9bfd6550-6dba-4e2c-9b51-31134c8afa90-certs\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925751 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/df6fa2da-47ec-4a8f-b12b-59c640e0a361-machine-approver-tls\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925776 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4zn8\" (UniqueName: \"kubernetes.io/projected/df6fa2da-47ec-4a8f-b12b-59c640e0a361-kube-api-access-f4zn8\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925815 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9rch\" (UniqueName: \"kubernetes.io/projected/e00547a5-b2bd-495e-9d84-d6d6162bf42b-kube-api-access-r9rch\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925838 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53d14281-ca5f-4420-a56f-a9fd192c7e58-profile-collector-cert\") pod \"catalog-operator-68c6474976-5r8lf\" (UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925880 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-signing-key\") pod \"service-ca-9c57cc56f-ff562\" (UID: \"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925903 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-signing-cabundle\") pod \"service-ca-9c57cc56f-ff562\" (UID: \"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925931 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925957 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c05edf1-5079-4212-ba5c-19621b2500cf-config\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.925994 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-config\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926021 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/405ef230-1a28-4605-af3a-d3c7242943ce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: \"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926046 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9e19eb-eadb-478e-b46d-f717a7a7c3de-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b7p8b\" (UID: \"5f9e19eb-eadb-478e-b46d-f717a7a7c3de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926093 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec637495-0e89-4ff0-9f59-2079144aa380-serving-cert\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926120 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc8kl\" (UniqueName: \"kubernetes.io/projected/6f523e3f-cda8-49a8-871f-06d20ce5834e-kube-api-access-xc8kl\") pod \"multus-admission-controller-857f4d67dd-x2rv5\" (UID: \"6f523e3f-cda8-49a8-871f-06d20ce5834e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926144 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr22m\" (UniqueName: \"kubernetes.io/projected/53d14281-ca5f-4420-a56f-a9fd192c7e58-kube-api-access-qr22m\") pod \"catalog-operator-68c6474976-5r8lf\" 
(UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926171 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d651c778-1304-44b3-b21a-4c3ba2158db4-webhook-cert\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926199 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6fa2da-47ec-4a8f-b12b-59c640e0a361-config\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926225 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zthn2\" (UniqueName: \"kubernetes.io/projected/363b3949-b8a3-4fd4-a13d-281d29c61822-kube-api-access-zthn2\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926251 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pgv2\" (UniqueName: \"kubernetes.io/projected/9bfd6550-6dba-4e2c-9b51-31134c8afa90-kube-api-access-9pgv2\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926277 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gvm22\" (UniqueName: \"kubernetes.io/projected/d651c778-1304-44b3-b21a-4c3ba2158db4-kube-api-access-gvm22\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926318 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-config\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926343 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-stats-auth\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926368 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9chtn\" (UniqueName: \"kubernetes.io/projected/05a01af8-3578-44e1-8398-00e57ae2c4c2-kube-api-access-9chtn\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926414 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-oauth-serving-cert\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: 
I0318 15:38:51.926423 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5744ac11-6c36-4634-903e-298dc7b5ce45-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926542 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-image-import-ca\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926602 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-config\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926637 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926703 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6eeff11-0e98-48f5-a868-a7016d57be14-serving-cert\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926733 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c59c714a-1145-4630-9d30-24d15369e2b6-config\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926759 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c59c714a-1145-4630-9d30-24d15369e2b6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926788 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05a01af8-3578-44e1-8398-00e57ae2c4c2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926817 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-audit-dir\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926822 4696 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5744ac11-6c36-4634-903e-298dc7b5ce45-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926849 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7ftd\" (UniqueName: \"kubernetes.io/projected/854ec8b0-a321-4bcb-9327-96742fec3f31-kube-api-access-p7ftd\") pod \"auto-csr-approver-29564138-jzn9j\" (UID: \"854ec8b0-a321-4bcb-9327-96742fec3f31\") " pod="openshift-infra/auto-csr-approver-29564138-jzn9j" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926896 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c38fdccb-264a-4d02-9f5c-19140a3df2f2-serving-cert\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926924 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28cd9598-75cc-4316-a078-c4369354b5af-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.926953 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-audit-policies\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 
18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927041 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8gk\" (UniqueName: \"kubernetes.io/projected/102d31d1-4cba-46d5-8f36-727dd4379b90-kube-api-access-4k8gk\") pod \"migrator-59844c95c7-4qhl8\" (UID: \"102d31d1-4cba-46d5-8f36-727dd4379b90\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927070 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-serving-cert\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927098 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qfg\" (UniqueName: \"kubernetes.io/projected/3733dd99-82f2-4602-b0e2-ece3c16cd446-kube-api-access-75qfg\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927121 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-config\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927147 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-mountpoint-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927183 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-certificates\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927196 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-service-ca-bundle\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927206 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a01af8-3578-44e1-8398-00e57ae2c4c2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927250 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-metrics-certs\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927283 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927312 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnz5q\" (UniqueName: \"kubernetes.io/projected/b0297eb3-0438-4db1-97bd-405779a01255-kube-api-access-gnz5q\") pod \"control-plane-machine-set-operator-78cbb6b69f-gx9sb\" (UID: \"b0297eb3-0438-4db1-97bd-405779a01255\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927354 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927397 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f523e3f-cda8-49a8-871f-06d20ce5834e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x2rv5\" (UID: \"6f523e3f-cda8-49a8-871f-06d20ce5834e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927421 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10213877-0b4a-463d-b286-f2a0fb2e3fd6-config-volume\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " 
pod="openshift-dns/dns-default-8286f" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927466 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927490 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94443ebd-69c4-4f6b-90a6-13cd2da51741-service-ca-bundle\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927534 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c38fdccb-264a-4d02-9f5c-19140a3df2f2-trusted-ca\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.927956 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-trusted-ca\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.928383 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-image-import-ca\") pod \"apiserver-76f77b778f-6k8ll\" (UID: 
\"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929104 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-config\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929319 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929365 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l5xd\" (UniqueName: \"kubernetes.io/projected/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-kube-api-access-9l5xd\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929401 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5744ac11-6c36-4634-903e-298dc7b5ce45-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929421 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929441 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-etcd-client\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929470 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-trusted-ca\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929627 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-plugins-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929653 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-csi-data-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929679 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvh4\" (UniqueName: \"kubernetes.io/projected/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-kube-api-access-zhvh4\") pod \"service-ca-9c57cc56f-ff562\" (UID: \"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929697 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfj4j\" (UniqueName: \"kubernetes.io/projected/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-kube-api-access-pfj4j\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929718 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d651c778-1304-44b3-b21a-4c3ba2158db4-apiservice-cert\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929738 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929768 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405ef230-1a28-4605-af3a-d3c7242943ce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: 
\"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929789 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/363b3949-b8a3-4fd4-a13d-281d29c61822-node-pullsecrets\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929811 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df6fa2da-47ec-4a8f-b12b-59c640e0a361-auth-proxy-config\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929831 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qm94\" (UniqueName: \"kubernetes.io/projected/5f9e19eb-eadb-478e-b46d-f717a7a7c3de-kube-api-access-7qm94\") pod \"package-server-manager-789f6589d5-b7p8b\" (UID: \"5f9e19eb-eadb-478e-b46d-f717a7a7c3de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929853 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929897 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38fdccb-264a-4d02-9f5c-19140a3df2f2-config\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929923 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-encryption-config\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929943 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-serving-cert\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929963 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-serving-cert\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929980 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/363b3949-b8a3-4fd4-a13d-281d29c61822-audit-dir\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.929998 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10213877-0b4a-463d-b286-f2a0fb2e3fd6-metrics-tls\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " pod="openshift-dns/dns-default-8286f" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.930255 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28cd9598-75cc-4316-a078-c4369354b5af-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.930802 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-certificates\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.930913 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-config\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.932155 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-trusted-ca-bundle\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.932837 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.932878 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-etcd-serving-ca\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.932906 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e00547a5-b2bd-495e-9d84-d6d6162bf42b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.932934 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-registration-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.932982 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8339464b-c883-44d7-95eb-57c32689e91b-secret-volume\") pod \"collect-profiles-29564130-4tdt6\" (UID: 
\"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.933704 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-etcd-serving-ca\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.934945 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-config\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.935308 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2dbc3c5-555b-4b76-af97-bbe1de318efe-cert\") pod \"ingress-canary-2x748\" (UID: \"b2dbc3c5-555b-4b76-af97-bbe1de318efe\") " pod="openshift-ingress-canary/ingress-canary-2x748" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.935353 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-service-ca\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.935378 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/914a7337-c621-43a6-8294-50390f768e28-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.935401 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-encryption-config\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.935440 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-metrics-tls\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.935783 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-config\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.936108 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-service-ca\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.936882 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0c05edf1-5079-4212-ba5c-19621b2500cf-config\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.937399 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-encryption-config\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.937713 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94443ebd-69c4-4f6b-90a6-13cd2da51741-service-ca-bundle\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.938037 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.938144 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/914a7337-c621-43a6-8294-50390f768e28-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.938197 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38fdccb-264a-4d02-9f5c-19140a3df2f2-config\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.938274 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/363b3949-b8a3-4fd4-a13d-281d29c61822-audit-dir\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.938420 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/363b3949-b8a3-4fd4-a13d-281d29c61822-node-pullsecrets\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.953658 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-serving-cert\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.954001 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec637495-0e89-4ff0-9f59-2079144aa380-serving-cert\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.954056 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-serving-cert\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.954421 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c38fdccb-264a-4d02-9f5c-19140a3df2f2-trusted-ca\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.954997 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-stats-auth\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.955120 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c38fdccb-264a-4d02-9f5c-19140a3df2f2-serving-cert\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.955467 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-metrics-certs\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.955585 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/363b3949-b8a3-4fd4-a13d-281d29c61822-etcd-client\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.955795 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/198822b4-2fd5-4225-bfe8-f233eaf3d8fc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gdd4w\" (UID: \"198822b4-2fd5-4225-bfe8-f233eaf3d8fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.956009 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6eeff11-0e98-48f5-a868-a7016d57be14-serving-cert\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.958119 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec637495-0e89-4ff0-9f59-2079144aa380-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.958583 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-audit\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.959297 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c05edf1-5079-4212-ba5c-19621b2500cf-images\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.959479 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5f87\" (UniqueName: \"kubernetes.io/projected/f7245cec-cee9-4aa5-8087-81ad2f450977-kube-api-access-c5f87\") pod \"downloads-7954f5f757-498px\" (UID: \"f7245cec-cee9-4aa5-8087-81ad2f450977\") " pod="openshift-console/downloads-7954f5f757-498px" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.960318 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.960717 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c05edf1-5079-4212-ba5c-19621b2500cf-images\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: E0318 15:38:51.960763 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.460733907 +0000 UTC m=+175.466908113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.960801 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zspsz\" (UniqueName: \"kubernetes.io/projected/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-kube-api-access-zspsz\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.960898 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-oauth-config\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.961414 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c05edf1-5079-4212-ba5c-19621b2500cf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.961484 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/363b3949-b8a3-4fd4-a13d-281d29c61822-audit\") pod 
\"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.961741 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmj4r\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-kube-api-access-dmj4r\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.961786 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pk8t\" (UniqueName: \"kubernetes.io/projected/0c05edf1-5079-4212-ba5c-19621b2500cf-kube-api-access-6pk8t\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.962281 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t98l6\" (UniqueName: \"kubernetes.io/projected/c38fdccb-264a-4d02-9f5c-19140a3df2f2-kube-api-access-t98l6\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.962888 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-config\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.962972 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-client-ca\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963015 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-config\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963098 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-socket-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963142 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtvvw\" (UniqueName: \"kubernetes.io/projected/55f95b82-6e61-4b39-a2e6-6685d37d61e8-kube-api-access-wtvvw\") pod \"dns-operator-744455d44c-69fs4\" (UID: \"55f95b82-6e61-4b39-a2e6-6685d37d61e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963177 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/28cd9598-75cc-4316-a078-c4369354b5af-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc 
kubenswrapper[4696]: I0318 15:38:51.963272 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/405ef230-1a28-4605-af3a-d3c7242943ce-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: \"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963317 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/28cd9598-75cc-4316-a078-c4369354b5af-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963396 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-default-certificate\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963437 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/914a7337-c621-43a6-8294-50390f768e28-proxy-tls\") pod \"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963474 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhmx\" (UniqueName: 
\"kubernetes.io/projected/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-kube-api-access-cxhmx\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.963508 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv8hv\" (UniqueName: \"kubernetes.io/projected/8339464b-c883-44d7-95eb-57c32689e91b-kube-api-access-qv8hv\") pod \"collect-profiles-29564130-4tdt6\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.965589 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-config\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.965928 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-tls\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.966740 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwglp\" (UniqueName: \"kubernetes.io/projected/a6eeff11-0e98-48f5-a868-a7016d57be14-kube-api-access-jwglp\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc 
kubenswrapper[4696]: I0318 15:38:51.968087 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/55f95b82-6e61-4b39-a2e6-6685d37d61e8-metrics-tls\") pod \"dns-operator-744455d44c-69fs4\" (UID: \"55f95b82-6e61-4b39-a2e6-6685d37d61e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968132 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968164 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e00547a5-b2bd-495e-9d84-d6d6162bf42b-srv-cert\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968212 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d651c778-1304-44b3-b21a-4c3ba2158db4-tmpfs\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968268 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-serving-cert\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") 
" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.966884 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-client-ca\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968309 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bmg\" (UniqueName: \"kubernetes.io/projected/e4224061-726c-4bee-84d0-9b1dfddcdaa4-kube-api-access-f2bmg\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968337 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0297eb3-0438-4db1-97bd-405779a01255-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gx9sb\" (UID: \"b0297eb3-0438-4db1-97bd-405779a01255\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968366 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgprf\" (UniqueName: \"kubernetes.io/projected/10213877-0b4a-463d-b286-f2a0fb2e3fd6-kube-api-access-pgprf\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " pod="openshift-dns/dns-default-8286f" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968387 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f664j\" (UniqueName: \"kubernetes.io/projected/cb247c8f-8684-477b-8c7a-4a222474a497-kube-api-access-f664j\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968613 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968646 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6gb\" (UniqueName: \"kubernetes.io/projected/ec637495-0e89-4ff0-9f59-2079144aa380-kube-api-access-tc6gb\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.968671 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9bfd6550-6dba-4e2c-9b51-31134c8afa90-node-bootstrap-token\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.969902 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.972210 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/55f95b82-6e61-4b39-a2e6-6685d37d61e8-metrics-tls\") pod \"dns-operator-744455d44c-69fs4\" (UID: \"55f95b82-6e61-4b39-a2e6-6685d37d61e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.973313 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.973668 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94443ebd-69c4-4f6b-90a6-13cd2da51741-default-certificate\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.974573 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.975627 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/28cd9598-75cc-4316-a078-c4369354b5af-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.980015 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-bound-sa-token\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.982325 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.984034 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5744ac11-6c36-4634-903e-298dc7b5ce45-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.985028 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-oauth-config\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.986907 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/914a7337-c621-43a6-8294-50390f768e28-proxy-tls\") pod \"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.988252 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-tls\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:51 crc kubenswrapper[4696]: I0318 15:38:51.989138 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28d5l\" (UniqueName: \"kubernetes.io/projected/28cd9598-75cc-4316-a078-c4369354b5af-kube-api-access-28d5l\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.006930 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95cnx\" (UniqueName: \"kubernetes.io/projected/94443ebd-69c4-4f6b-90a6-13cd2da51741-kube-api-access-95cnx\") pod \"router-default-5444994796-xk9kb\" (UID: \"94443ebd-69c4-4f6b-90a6-13cd2da51741\") " pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.024909 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h8tkm\" (UniqueName: \"kubernetes.io/projected/914a7337-c621-43a6-8294-50390f768e28-kube-api-access-h8tkm\") pod \"machine-config-controller-84d6567774-n7svq\" (UID: \"914a7337-c621-43a6-8294-50390f768e28\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.043323 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8gk\" (UniqueName: \"kubernetes.io/projected/102d31d1-4cba-46d5-8f36-727dd4379b90-kube-api-access-4k8gk\") pod \"migrator-59844c95c7-4qhl8\" (UID: \"102d31d1-4cba-46d5-8f36-727dd4379b90\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.060619 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cb0a8a-4464-4d3c-93a4-0d933d0fff77-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gxl5n\" (UID: \"09cb0a8a-4464-4d3c-93a4-0d933d0fff77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074002 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074261 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-etcd-client\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074298 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-trusted-ca\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074325 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-plugins-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074352 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-csi-data-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074374 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074411 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvh4\" (UniqueName: \"kubernetes.io/projected/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-kube-api-access-zhvh4\") pod \"service-ca-9c57cc56f-ff562\" (UID: 
\"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074434 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfj4j\" (UniqueName: \"kubernetes.io/projected/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-kube-api-access-pfj4j\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074459 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d651c778-1304-44b3-b21a-4c3ba2158db4-apiservice-cert\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074483 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405ef230-1a28-4605-af3a-d3c7242943ce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: \"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074509 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df6fa2da-47ec-4a8f-b12b-59c640e0a361-auth-proxy-config\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074549 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qm94\" 
(UniqueName: \"kubernetes.io/projected/5f9e19eb-eadb-478e-b46d-f717a7a7c3de-kube-api-access-7qm94\") pod \"package-server-manager-789f6589d5-b7p8b\" (UID: \"5f9e19eb-eadb-478e-b46d-f717a7a7c3de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074581 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-serving-cert\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074605 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10213877-0b4a-463d-b286-f2a0fb2e3fd6-metrics-tls\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " pod="openshift-dns/dns-default-8286f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074640 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e00547a5-b2bd-495e-9d84-d6d6162bf42b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074659 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-registration-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074681 4696 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8339464b-c883-44d7-95eb-57c32689e91b-secret-volume\") pod \"collect-profiles-29564130-4tdt6\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074700 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2dbc3c5-555b-4b76-af97-bbe1de318efe-cert\") pod \"ingress-canary-2x748\" (UID: \"b2dbc3c5-555b-4b76-af97-bbe1de318efe\") " pod="openshift-ingress-canary/ingress-canary-2x748" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074718 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-metrics-tls\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074749 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-encryption-config\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074823 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-socket-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074849 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-config\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074884 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/405ef230-1a28-4605-af3a-d3c7242943ce-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: \"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074909 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxhmx\" (UniqueName: \"kubernetes.io/projected/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-kube-api-access-cxhmx\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074934 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8hv\" (UniqueName: \"kubernetes.io/projected/8339464b-c883-44d7-95eb-57c32689e91b-kube-api-access-qv8hv\") pod \"collect-profiles-29564130-4tdt6\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.074961 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc 
kubenswrapper[4696]: I0318 15:38:52.074983 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e00547a5-b2bd-495e-9d84-d6d6162bf42b-srv-cert\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075019 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d651c778-1304-44b3-b21a-4c3ba2158db4-tmpfs\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075041 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-serving-cert\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075078 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bmg\" (UniqueName: \"kubernetes.io/projected/e4224061-726c-4bee-84d0-9b1dfddcdaa4-kube-api-access-f2bmg\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075107 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0297eb3-0438-4db1-97bd-405779a01255-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gx9sb\" (UID: 
\"b0297eb3-0438-4db1-97bd-405779a01255\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075138 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgprf\" (UniqueName: \"kubernetes.io/projected/10213877-0b4a-463d-b286-f2a0fb2e3fd6-kube-api-access-pgprf\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " pod="openshift-dns/dns-default-8286f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075163 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f664j\" (UniqueName: \"kubernetes.io/projected/cb247c8f-8684-477b-8c7a-4a222474a497-kube-api-access-f664j\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075195 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075217 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9bfd6550-6dba-4e2c-9b51-31134c8afa90-node-bootstrap-token\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075257 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8339464b-c883-44d7-95eb-57c32689e91b-config-volume\") pod \"collect-profiles-29564130-4tdt6\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075288 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c59c714a-1145-4630-9d30-24d15369e2b6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075313 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59vks\" (UniqueName: \"kubernetes.io/projected/b2dbc3c5-555b-4b76-af97-bbe1de318efe-kube-api-access-59vks\") pod \"ingress-canary-2x748\" (UID: \"b2dbc3c5-555b-4b76-af97-bbe1de318efe\") " pod="openshift-ingress-canary/ingress-canary-2x748" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075361 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53d14281-ca5f-4420-a56f-a9fd192c7e58-srv-cert\") pod \"catalog-operator-68c6474976-5r8lf\" (UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075393 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9bfd6550-6dba-4e2c-9b51-31134c8afa90-certs\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075413 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/df6fa2da-47ec-4a8f-b12b-59c640e0a361-machine-approver-tls\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075436 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4zn8\" (UniqueName: \"kubernetes.io/projected/df6fa2da-47ec-4a8f-b12b-59c640e0a361-kube-api-access-f4zn8\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075457 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rch\" (UniqueName: \"kubernetes.io/projected/e00547a5-b2bd-495e-9d84-d6d6162bf42b-kube-api-access-r9rch\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075477 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53d14281-ca5f-4420-a56f-a9fd192c7e58-profile-collector-cert\") pod \"catalog-operator-68c6474976-5r8lf\" (UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075498 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-signing-key\") pod \"service-ca-9c57cc56f-ff562\" (UID: \"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075535 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-signing-cabundle\") pod \"service-ca-9c57cc56f-ff562\" (UID: \"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075571 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405ef230-1a28-4605-af3a-d3c7242943ce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: \"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075591 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9e19eb-eadb-478e-b46d-f717a7a7c3de-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b7p8b\" (UID: \"5f9e19eb-eadb-478e-b46d-f717a7a7c3de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075615 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc8kl\" (UniqueName: \"kubernetes.io/projected/6f523e3f-cda8-49a8-871f-06d20ce5834e-kube-api-access-xc8kl\") pod \"multus-admission-controller-857f4d67dd-x2rv5\" (UID: \"6f523e3f-cda8-49a8-871f-06d20ce5834e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075637 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr22m\" 
(UniqueName: \"kubernetes.io/projected/53d14281-ca5f-4420-a56f-a9fd192c7e58-kube-api-access-qr22m\") pod \"catalog-operator-68c6474976-5r8lf\" (UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075655 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d651c778-1304-44b3-b21a-4c3ba2158db4-webhook-cert\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075697 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6fa2da-47ec-4a8f-b12b-59c640e0a361-config\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075717 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvm22\" (UniqueName: \"kubernetes.io/projected/d651c778-1304-44b3-b21a-4c3ba2158db4-kube-api-access-gvm22\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075759 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pgv2\" (UniqueName: \"kubernetes.io/projected/9bfd6550-6dba-4e2c-9b51-31134c8afa90-kube-api-access-9pgv2\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 
15:38:52.075783 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9chtn\" (UniqueName: \"kubernetes.io/projected/05a01af8-3578-44e1-8398-00e57ae2c4c2-kube-api-access-9chtn\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075818 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075846 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c59c714a-1145-4630-9d30-24d15369e2b6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075871 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c59c714a-1145-4630-9d30-24d15369e2b6-config\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075897 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05a01af8-3578-44e1-8398-00e57ae2c4c2-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075926 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-audit-dir\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075953 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7ftd\" (UniqueName: \"kubernetes.io/projected/854ec8b0-a321-4bcb-9327-96742fec3f31-kube-api-access-p7ftd\") pod \"auto-csr-approver-29564138-jzn9j\" (UID: \"854ec8b0-a321-4bcb-9327-96742fec3f31\") " pod="openshift-infra/auto-csr-approver-29564138-jzn9j" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.075985 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-audit-policies\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.076025 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-mountpoint-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.076080 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/05a01af8-3578-44e1-8398-00e57ae2c4c2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.076107 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnz5q\" (UniqueName: \"kubernetes.io/projected/b0297eb3-0438-4db1-97bd-405779a01255-kube-api-access-gnz5q\") pod \"control-plane-machine-set-operator-78cbb6b69f-gx9sb\" (UID: \"b0297eb3-0438-4db1-97bd-405779a01255\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.076145 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f523e3f-cda8-49a8-871f-06d20ce5834e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x2rv5\" (UID: \"6f523e3f-cda8-49a8-871f-06d20ce5834e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.076167 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10213877-0b4a-463d-b286-f2a0fb2e3fd6-config-volume\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " pod="openshift-dns/dns-default-8286f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.076200 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: 
I0318 15:38:52.076223 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l5xd\" (UniqueName: \"kubernetes.io/projected/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-kube-api-access-9l5xd\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.076386 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d651c778-1304-44b3-b21a-4c3ba2158db4-tmpfs\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.076512 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.57648745 +0000 UTC m=+175.582661656 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.079654 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-485wk"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.080121 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qfg\" (UniqueName: \"kubernetes.io/projected/3733dd99-82f2-4602-b0e2-ece3c16cd446-kube-api-access-75qfg\") pod \"console-f9d7485db-qbzqg\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.080953 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-etcd-client\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.081838 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-trusted-ca\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.082092 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-plugins-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.082155 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-csi-data-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.083708 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53d14281-ca5f-4420-a56f-a9fd192c7e58-profile-collector-cert\") pod \"catalog-operator-68c6474976-5r8lf\" (UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.084338 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405ef230-1a28-4605-af3a-d3c7242943ce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: \"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.085338 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a01af8-3578-44e1-8398-00e57ae2c4c2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.085416 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-audit-dir\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.087022 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-audit-policies\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.087122 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-mountpoint-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.089904 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-serving-cert\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.091479 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-signing-cabundle\") pod \"service-ca-9c57cc56f-ff562\" (UID: \"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.091608 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9bfd6550-6dba-4e2c-9b51-31134c8afa90-certs\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.092135 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10213877-0b4a-463d-b286-f2a0fb2e3fd6-config-volume\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " pod="openshift-dns/dns-default-8286f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.092714 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.094724 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.099858 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9e19eb-eadb-478e-b46d-f717a7a7c3de-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b7p8b\" (UID: \"5f9e19eb-eadb-478e-b46d-f717a7a7c3de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.100019 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c59c714a-1145-4630-9d30-24d15369e2b6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.101714 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-registration-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.102340 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-config\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.102485 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53d14281-ca5f-4420-a56f-a9fd192c7e58-srv-cert\") pod \"catalog-operator-68c6474976-5r8lf\" (UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.103055 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8339464b-c883-44d7-95eb-57c32689e91b-config-volume\") pod \"collect-profiles-29564130-4tdt6\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:52 crc kubenswrapper[4696]: 
I0318 15:38:52.103900 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.104721 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405ef230-1a28-4605-af3a-d3c7242943ce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: \"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.104791 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6fa2da-47ec-4a8f-b12b-59c640e0a361-config\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.104943 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb247c8f-8684-477b-8c7a-4a222474a497-socket-dir\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.105009 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c59c714a-1145-4630-9d30-24d15369e2b6-config\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.105380 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df6fa2da-47ec-4a8f-b12b-59c640e0a361-auth-proxy-config\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.105420 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.106092 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05a01af8-3578-44e1-8398-00e57ae2c4c2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.111154 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.112077 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f523e3f-cda8-49a8-871f-06d20ce5834e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x2rv5\" (UID: \"6f523e3f-cda8-49a8-871f-06d20ce5834e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.113802 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/df6fa2da-47ec-4a8f-b12b-59c640e0a361-machine-approver-tls\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.114286 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10213877-0b4a-463d-b286-f2a0fb2e3fd6-metrics-tls\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " pod="openshift-dns/dns-default-8286f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.116197 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2dbc3c5-555b-4b76-af97-bbe1de318efe-cert\") pod \"ingress-canary-2x748\" (UID: \"b2dbc3c5-555b-4b76-af97-bbe1de318efe\") " pod="openshift-ingress-canary/ingress-canary-2x748" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.122001 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d651c778-1304-44b3-b21a-4c3ba2158db4-webhook-cert\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.122042 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e00547a5-b2bd-495e-9d84-d6d6162bf42b-srv-cert\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.122209 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/e00547a5-b2bd-495e-9d84-d6d6162bf42b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.122312 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-metrics-tls\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.122410 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0297eb3-0438-4db1-97bd-405779a01255-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gx9sb\" (UID: \"b0297eb3-0438-4db1-97bd-405779a01255\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.122462 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8339464b-c883-44d7-95eb-57c32689e91b-secret-volume\") pod \"collect-profiles-29564130-4tdt6\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.122726 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-signing-key\") pod \"service-ca-9c57cc56f-ff562\" (UID: \"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.122778 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9bfd6550-6dba-4e2c-9b51-31134c8afa90-node-bootstrap-token\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.123016 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-encryption-config\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.123732 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.125594 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d651c778-1304-44b3-b21a-4c3ba2158db4-apiservice-cert\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.128934 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-serving-cert\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc 
kubenswrapper[4696]: I0318 15:38:52.129114 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zthn2\" (UniqueName: \"kubernetes.io/projected/363b3949-b8a3-4fd4-a13d-281d29c61822-kube-api-access-zthn2\") pod \"apiserver-76f77b778f-6k8ll\" (UID: \"363b3949-b8a3-4fd4-a13d-281d29c61822\") " pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.130136 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k779w\" (UniqueName: \"kubernetes.io/projected/198822b4-2fd5-4225-bfe8-f233eaf3d8fc-kube-api-access-k779w\") pod \"cluster-samples-operator-665b6dd947-gdd4w\" (UID: \"198822b4-2fd5-4225-bfe8-f233eaf3d8fc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.141735 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8mtz5"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.158327 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5f87\" (UniqueName: \"kubernetes.io/projected/f7245cec-cee9-4aa5-8087-81ad2f450977-kube-api-access-c5f87\") pod \"downloads-7954f5f757-498px\" (UID: \"f7245cec-cee9-4aa5-8087-81ad2f450977\") " pod="openshift-console/downloads-7954f5f757-498px" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.162951 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.179849 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zspsz\" (UniqueName: \"kubernetes.io/projected/90a9c50c-4a8b-4731-9ee8-addc5dfb22f7-kube-api-access-zspsz\") pod \"openshift-apiserver-operator-796bbdcf4f-vpp52\" (UID: \"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.185838 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.186790 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.187218 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.687205537 +0000 UTC m=+175.693379743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.187601 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-498px" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.239371 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmj4r\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-kube-api-access-dmj4r\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.242510 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.244862 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.247727 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.253186 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t98l6\" (UniqueName: \"kubernetes.io/projected/c38fdccb-264a-4d02-9f5c-19140a3df2f2-kube-api-access-t98l6\") pod \"console-operator-58897d9998-9wfks\" (UID: \"c38fdccb-264a-4d02-9f5c-19140a3df2f2\") " pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.253821 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.254341 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/28cd9598-75cc-4316-a078-c4369354b5af-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cr7hv\" (UID: \"28cd9598-75cc-4316-a078-c4369354b5af\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.261619 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.266660 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pk8t\" (UniqueName: \"kubernetes.io/projected/0c05edf1-5079-4212-ba5c-19621b2500cf-kube-api-access-6pk8t\") pod \"machine-api-operator-5694c8668f-6xjgb\" (UID: \"0c05edf1-5079-4212-ba5c-19621b2500cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.286425 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwglp\" (UniqueName: \"kubernetes.io/projected/a6eeff11-0e98-48f5-a868-a7016d57be14-kube-api-access-jwglp\") pod \"route-controller-manager-6576b87f9c-kn2g2\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.288108 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.288679 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.78865655 +0000 UTC m=+175.794830756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.289463 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.296126 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6gb\" (UniqueName: \"kubernetes.io/projected/ec637495-0e89-4ff0-9f59-2079144aa380-kube-api-access-tc6gb\") pod \"authentication-operator-69f744f599-75kz8\" (UID: \"ec637495-0e89-4ff0-9f59-2079144aa380\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.307691 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" Mar 18 15:38:52 crc kubenswrapper[4696]: W0318 15:38:52.317005 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9aec63_9194_4040_b89c_6985e68607b9.slice/crio-fccc03ed345afb74dd254130d5f77a65380bc7e6e7b45ccdb4b7afddfc302f31 WatchSource:0}: Error finding container fccc03ed345afb74dd254130d5f77a65380bc7e6e7b45ccdb4b7afddfc302f31: Status 404 returned error can't find the container with id fccc03ed345afb74dd254130d5f77a65380bc7e6e7b45ccdb4b7afddfc302f31 Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.318430 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtvvw\" (UniqueName: \"kubernetes.io/projected/55f95b82-6e61-4b39-a2e6-6685d37d61e8-kube-api-access-wtvvw\") pod \"dns-operator-744455d44c-69fs4\" (UID: \"55f95b82-6e61-4b39-a2e6-6685d37d61e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.340743 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l5xd\" (UniqueName: \"kubernetes.io/projected/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-kube-api-access-9l5xd\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.352915 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.353018 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.379662 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4zn8\" (UniqueName: \"kubernetes.io/projected/df6fa2da-47ec-4a8f-b12b-59c640e0a361-kube-api-access-f4zn8\") pod \"machine-approver-56656f9798-rksxf\" (UID: \"df6fa2da-47ec-4a8f-b12b-59c640e0a361\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.382206 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a9512a6-dc1a-4c12-b080-55aeb712b1f0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ljcg6\" (UID: \"1a9512a6-dc1a-4c12-b080-55aeb712b1f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.393417 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.393589 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.393799 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.893785556 +0000 UTC m=+175.899959762 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.397112 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xk9kb" event={"ID":"94443ebd-69c4-4f6b-90a6-13cd2da51741","Type":"ContainerStarted","Data":"320b687115688ca0de000df4e6951c0abcfa7810ce33648bb46f963f597c35f2"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.398699 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" event={"ID":"d80e308e-71d5-484e-bdc3-ac15ef240b46","Type":"ContainerStarted","Data":"218bc365739ef81c9b39ed7674003ae8ea46647eca241a9bccb5d1e5fd1aaf8c"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.402996 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvh4\" (UniqueName: \"kubernetes.io/projected/7e889bb6-3e2c-4131-a1f7-bca28b996cd3-kube-api-access-zhvh4\") pod \"service-ca-9c57cc56f-ff562\" (UID: \"7e889bb6-3e2c-4131-a1f7-bca28b996cd3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.406254 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" event={"ID":"65110067-19f3-4355-bac5-fe08d6a07311","Type":"ContainerStarted","Data":"a97bb0e2191f8eea9faefb5d62648367087cdbaf29cd33b9f0bbee32c37a4f21"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.406321 4696 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" event={"ID":"65110067-19f3-4355-bac5-fe08d6a07311","Type":"ContainerStarted","Data":"f7aa3800745c0c2325793f85abaad79085127791c005fdec7d622b542f0df6f9"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.411819 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" event={"ID":"ce730c10-9854-4705-bac9-07fc1f23402c","Type":"ContainerStarted","Data":"2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.411881 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" event={"ID":"ce730c10-9854-4705-bac9-07fc1f23402c","Type":"ContainerStarted","Data":"cf7fd7a402f91b8d41dbf8d6de7c2936280f5d9cddbbad7090bd757d975e6807"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.413620 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.415033 4696 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mcsh8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.415078 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" podUID="ce730c10-9854-4705-bac9-07fc1f23402c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.419967 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" event={"ID":"6b9aec63-9194-4040-b89c-6985e68607b9","Type":"ContainerStarted","Data":"fccc03ed345afb74dd254130d5f77a65380bc7e6e7b45ccdb4b7afddfc302f31"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.422158 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7ftd\" (UniqueName: \"kubernetes.io/projected/854ec8b0-a321-4bcb-9327-96742fec3f31-kube-api-access-p7ftd\") pod \"auto-csr-approver-29564138-jzn9j\" (UID: \"854ec8b0-a321-4bcb-9327-96742fec3f31\") " pod="openshift-infra/auto-csr-approver-29564138-jzn9j" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.423034 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" event={"ID":"a589c8ef-17db-4df7-affb-8a40c753aaaa","Type":"ContainerStarted","Data":"25a5ad267780eb3af8c9d8dbcf28a087a20c3ea4016b8a55510b450eef7e06de"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.425813 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" event={"ID":"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b","Type":"ContainerStarted","Data":"0a3ca4e14fe53b8524094d20f76759906cd48ae987da0ab2bce3c4a3a36611eb"} Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.448104 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc8kl\" (UniqueName: \"kubernetes.io/projected/6f523e3f-cda8-49a8-871f-06d20ce5834e-kube-api-access-xc8kl\") pod \"multus-admission-controller-857f4d67dd-x2rv5\" (UID: \"6f523e3f-cda8-49a8-871f-06d20ce5834e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.459903 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.469230 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/405ef230-1a28-4605-af3a-d3c7242943ce-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dshv6\" (UID: \"405ef230-1a28-4605-af3a-d3c7242943ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.481471 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9rch\" (UniqueName: \"kubernetes.io/projected/e00547a5-b2bd-495e-9d84-d6d6162bf42b-kube-api-access-r9rch\") pod \"olm-operator-6b444d44fb-5ljxk\" (UID: \"e00547a5-b2bd-495e-9d84-d6d6162bf42b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.482260 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qbzqg"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.498202 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.498349 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.998327217 +0000 UTC m=+176.004501433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.498589 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.499865 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:52.999854016 +0000 UTC m=+176.006028312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.514153 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnz5q\" (UniqueName: \"kubernetes.io/projected/b0297eb3-0438-4db1-97bd-405779a01255-kube-api-access-gnz5q\") pod \"control-plane-machine-set-operator-78cbb6b69f-gx9sb\" (UID: \"b0297eb3-0438-4db1-97bd-405779a01255\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.525119 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-498px"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.525667 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfj4j\" (UniqueName: \"kubernetes.io/projected/ecd7ad5d-edd9-4ae4-8b42-cc17562d6182-kube-api-access-pfj4j\") pod \"service-ca-operator-777779d784-m4tfr\" (UID: \"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:52 crc kubenswrapper[4696]: W0318 15:38:52.526465 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3733dd99_82f2_4602_b0e2_ece3c16cd446.slice/crio-d6fbf371b1d83b5ff607d9ca941119f047eccd764e55b0ea4fa53819d52df8a3 WatchSource:0}: Error finding container d6fbf371b1d83b5ff607d9ca941119f047eccd764e55b0ea4fa53819d52df8a3: Status 404 returned error can't find the container with id 
d6fbf371b1d83b5ff607d9ca941119f047eccd764e55b0ea4fa53819d52df8a3 Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.529431 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.535836 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.554357 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxhmx\" (UniqueName: \"kubernetes.io/projected/f3bc2762-13c0-4c5b-b3c9-af4f68e1858e-kube-api-access-cxhmx\") pod \"apiserver-7bbb656c7d-nb9v4\" (UID: \"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.562940 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8hv\" (UniqueName: \"kubernetes.io/projected/8339464b-c883-44d7-95eb-57c32689e91b-kube-api-access-qv8hv\") pod \"collect-profiles-29564130-4tdt6\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.580909 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.587011 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.587133 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.594870 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.600196 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.601101 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.601168 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pgv2\" (UniqueName: \"kubernetes.io/projected/9bfd6550-6dba-4e2c-9b51-31134c8afa90-kube-api-access-9pgv2\") pod \"machine-config-server-vgrlm\" (UID: \"9bfd6550-6dba-4e2c-9b51-31134c8afa90\") " pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.601379 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.10136313 +0000 UTC m=+176.107537426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.601447 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.608414 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.108376447 +0000 UTC m=+176.114550653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.616030 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.622061 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.625496 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr22m\" (UniqueName: \"kubernetes.io/projected/53d14281-ca5f-4420-a56f-a9fd192c7e58-kube-api-access-qr22m\") pod \"catalog-operator-68c6474976-5r8lf\" (UID: \"53d14281-ca5f-4420-a56f-a9fd192c7e58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.628569 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.629281 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9chtn\" (UniqueName: \"kubernetes.io/projected/05a01af8-3578-44e1-8398-00e57ae2c4c2-kube-api-access-9chtn\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9cb8\" (UID: \"05a01af8-3578-44e1-8398-00e57ae2c4c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.633949 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.639589 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.651453 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.655260 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.659366 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f664j\" (UniqueName: \"kubernetes.io/projected/cb247c8f-8684-477b-8c7a-4a222474a497-kube-api-access-f664j\") pod \"csi-hostpathplugin-4bxnr\" (UID: \"cb247c8f-8684-477b-8c7a-4a222474a497\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.661445 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgprf\" (UniqueName: \"kubernetes.io/projected/10213877-0b4a-463d-b286-f2a0fb2e3fd6-kube-api-access-pgprf\") pod \"dns-default-8286f\" (UID: \"10213877-0b4a-463d-b286-f2a0fb2e3fd6\") " pod="openshift-dns/dns-default-8286f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.667121 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.667745 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ff562" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.683015 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vgrlm" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.701048 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c59c714a-1145-4630-9d30-24d15369e2b6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vq9f\" (UID: \"c59c714a-1145-4630-9d30-24d15369e2b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.701694 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.702901 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qm94\" (UniqueName: \"kubernetes.io/projected/5f9e19eb-eadb-478e-b46d-f717a7a7c3de-kube-api-access-7qm94\") pod \"package-server-manager-789f6589d5-b7p8b\" (UID: \"5f9e19eb-eadb-478e-b46d-f717a7a7c3de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.709255 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.709991 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8286f" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.710054 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.210036945 +0000 UTC m=+176.216211151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.715745 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.717446 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59vks\" (UniqueName: \"kubernetes.io/projected/b2dbc3c5-555b-4b76-af97-bbe1de318efe-kube-api-access-59vks\") pod \"ingress-canary-2x748\" (UID: \"b2dbc3c5-555b-4b76-af97-bbe1de318efe\") " pod="openshift-ingress-canary/ingress-canary-2x748" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.733114 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2x748" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.741723 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvm22\" (UniqueName: \"kubernetes.io/projected/d651c778-1304-44b3-b21a-4c3ba2158db4-kube-api-access-gvm22\") pod \"packageserver-d55dfcdfc-rgwf8\" (UID: \"d651c778-1304-44b3-b21a-4c3ba2158db4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.775276 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bmg\" (UniqueName: \"kubernetes.io/projected/e4224061-726c-4bee-84d0-9b1dfddcdaa4-kube-api-access-f2bmg\") pod \"marketplace-operator-79b997595-r4wjj\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.811252 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.811851 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.311825407 +0000 UTC m=+176.317999613 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.874993 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.906662 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.912158 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.912335 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.412297256 +0000 UTC m=+176.418471462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.912725 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:52 crc kubenswrapper[4696]: E0318 15:38:52.913210 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.413198718 +0000 UTC m=+176.419372924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.940670 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv"] Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.942933 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" Mar 18 15:38:52 crc kubenswrapper[4696]: I0318 15:38:52.979307 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.014182 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.015219 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.515196325 +0000 UTC m=+176.521370531 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.076849 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6k8ll"] Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.089940 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq"] Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.106117 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n"] Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.116663 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.117023 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.617009477 +0000 UTC m=+176.623183683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.119623 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75kz8"] Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.217558 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.218379 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.718359248 +0000 UTC m=+176.724533444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.323176 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.323627 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.823606646 +0000 UTC m=+176.829780852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.436422 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.437868 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:53.937837051 +0000 UTC m=+176.944011267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.456877 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" event={"ID":"09cb0a8a-4464-4d3c-93a4-0d933d0fff77","Type":"ContainerStarted","Data":"6d4522a7831779fe09dc8bd1fe3cbaf5b80363a27eda661b480422eae28c0ac6"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.496774 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" event={"ID":"363b3949-b8a3-4fd4-a13d-281d29c61822","Type":"ContainerStarted","Data":"feaee5bb51658ff21c67ab8000704a5a30ed164002e55044ad6b681ee8a15eb0"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.499482 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vgrlm" event={"ID":"9bfd6550-6dba-4e2c-9b51-31134c8afa90","Type":"ContainerStarted","Data":"2a3e4629f0c2f8b33a79d0f5b62e06c2eb91b75d498b8f2f42410493ba18af5f"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.501017 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" event={"ID":"914a7337-c621-43a6-8294-50390f768e28","Type":"ContainerStarted","Data":"985fafee8faf3b06cd1e5ce7655a2db4a0210d18e56313e0ad4feadedbef5eb9"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.505332 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8" event={"ID":"102d31d1-4cba-46d5-8f36-727dd4379b90","Type":"ContainerStarted","Data":"2f7d062eea1e308a46e412a2c4abdca5ac51df5483fd8c17ddb60186ec57b97c"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.521486 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" event={"ID":"ec637495-0e89-4ff0-9f59-2079144aa380","Type":"ContainerStarted","Data":"105792a44b73f955689f145f49e8a091f933c846c5f25a4aa548dc84448be07a"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.527373 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xk9kb" event={"ID":"94443ebd-69c4-4f6b-90a6-13cd2da51741","Type":"ContainerStarted","Data":"62d6683850a6dee34d555b075fc2636f5db92a8014179c80aa9e799f6e5636d1"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.531260 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" event={"ID":"d80e308e-71d5-484e-bdc3-ac15ef240b46","Type":"ContainerStarted","Data":"2569f51e44dae5be957cf64b9452ba9c208b01c5a06baf271c89339bb9a1730a"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.541361 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.541771 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 15:38:54.041756267 +0000 UTC m=+177.047930473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.542730 4696 generic.go:334] "Generic (PLEG): container finished" podID="6b9aec63-9194-4040-b89c-6985e68607b9" containerID="bbd5266f8b1eb57a3d33f2f274649892355840031dda96b67b0dd471475d3ab7" exitCode=0 Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.542785 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" event={"ID":"6b9aec63-9194-4040-b89c-6985e68607b9","Type":"ContainerDied","Data":"bbd5266f8b1eb57a3d33f2f274649892355840031dda96b67b0dd471475d3ab7"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.546045 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qbzqg" event={"ID":"3733dd99-82f2-4602-b0e2-ece3c16cd446","Type":"ContainerStarted","Data":"fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.546078 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qbzqg" event={"ID":"3733dd99-82f2-4602-b0e2-ece3c16cd446","Type":"ContainerStarted","Data":"d6fbf371b1d83b5ff607d9ca941119f047eccd764e55b0ea4fa53819d52df8a3"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.552309 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" 
event={"ID":"df6fa2da-47ec-4a8f-b12b-59c640e0a361","Type":"ContainerStarted","Data":"cbd067f8933c7b813f5a534078c26f7916b24c3b38fe5d04cd2a64d797b803fd"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.554329 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" event={"ID":"28cd9598-75cc-4316-a078-c4369354b5af","Type":"ContainerStarted","Data":"45eabdf9e1294891ad9403b7d45e00c4eb0888b9fd8d15cb2e2a9df065978127"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.564671 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" event={"ID":"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b","Type":"ContainerStarted","Data":"525b2a88bde634934cf2508af1f388790ec77703356bbf90faf081f5238491dc"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.571760 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" event={"ID":"198822b4-2fd5-4225-bfe8-f233eaf3d8fc","Type":"ContainerStarted","Data":"360b38efb5d4d6018d8a18d6206bf143dd11c78133a89d0bc8a35975b8f01d2b"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.586438 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-498px" event={"ID":"f7245cec-cee9-4aa5-8087-81ad2f450977","Type":"ContainerStarted","Data":"8484f834f10d0249a76ecabcbc0840559a8e9b362748e40d3fbb189409471e25"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.604566 4696 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-485wk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" start-of-body= Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.604654 4696 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-558db77b4-485wk" podUID="a589c8ef-17db-4df7-affb-8a40c753aaaa" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.642317 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.642648 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.142627045 +0000 UTC m=+177.148801261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.642892 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.644698 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.144686687 +0000 UTC m=+177.150861123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.744696 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.746701 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.246671454 +0000 UTC m=+177.252845670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.782856 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xk9kb" podStartSLOduration=129.782832764 podStartE2EDuration="2m9.782832764s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:53.78269129 +0000 UTC m=+176.788865516" watchObservedRunningTime="2026-03-18 15:38:53.782832764 +0000 UTC m=+176.789006970" Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.802381 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" event={"ID":"a589c8ef-17db-4df7-affb-8a40c753aaaa","Type":"ContainerStarted","Data":"3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7"} Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.802440 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.802502 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.847474 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.848145 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.348124937 +0000 UTC m=+177.354299153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.951467 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.951844 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.451814947 +0000 UTC m=+177.457989153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:53 crc kubenswrapper[4696]: I0318 15:38:53.954244 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:53 crc kubenswrapper[4696]: E0318 15:38:53.960633 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.460606208 +0000 UTC m=+177.466780414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.024499 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jnnmf" podStartSLOduration=130.024472455 podStartE2EDuration="2m10.024472455s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:54.023202383 +0000 UTC m=+177.029376579" watchObservedRunningTime="2026-03-18 15:38:54.024472455 +0000 UTC m=+177.030646661" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.056176 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.056469 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.55645393 +0000 UTC m=+177.562628136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.157808 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.158565 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.65854989 +0000 UTC m=+177.664724096 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.190338 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.260663 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.261196 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.761148232 +0000 UTC m=+177.767322448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.262835 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-qbzqg" podStartSLOduration=130.262804274 podStartE2EDuration="2m10.262804274s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:54.26267836 +0000 UTC m=+177.268852566" watchObservedRunningTime="2026-03-18 15:38:54.262804274 +0000 UTC m=+177.268978480" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.264845 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:38:54 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:38:54 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:38:54 crc kubenswrapper[4696]: healthz check failed Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.264902 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.292964 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" podStartSLOduration=130.292927522 podStartE2EDuration="2m10.292927522s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:54.194928765 +0000 UTC m=+177.201102971" watchObservedRunningTime="2026-03-18 15:38:54.292927522 +0000 UTC m=+177.299101728" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.301383 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8mtz5" podStartSLOduration=130.301353384 podStartE2EDuration="2m10.301353384s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:54.300386299 +0000 UTC m=+177.306560515" watchObservedRunningTime="2026-03-18 15:38:54.301353384 +0000 UTC m=+177.307527590" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.365730 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.366099 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.866087463 +0000 UTC m=+177.872261669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.470869 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.471686 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:54.9716688 +0000 UTC m=+177.977843006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.556170 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6xjgb"] Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.573439 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.574252 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:55.074237892 +0000 UTC m=+178.080412088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.612557 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" podStartSLOduration=129.612535525 podStartE2EDuration="2m9.612535525s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:54.609959181 +0000 UTC m=+177.616133407" watchObservedRunningTime="2026-03-18 15:38:54.612535525 +0000 UTC m=+177.618709751" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.633895 4696 generic.go:334] "Generic (PLEG): container finished" podID="363b3949-b8a3-4fd4-a13d-281d29c61822" containerID="be7ca016bfac9c0943b7bebfea9dedfee364400ceafa57d97e9f3e93b8d1f180" exitCode=0 Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.634296 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" event={"ID":"363b3949-b8a3-4fd4-a13d-281d29c61822","Type":"ContainerDied","Data":"be7ca016bfac9c0943b7bebfea9dedfee364400ceafa57d97e9f3e93b8d1f180"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.646077 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" 
event={"ID":"09cb0a8a-4464-4d3c-93a4-0d933d0fff77","Type":"ContainerStarted","Data":"6a7380c32c29f1a8a2312876dc962256b06a21c52c536b346e7bc38e9ff6aa7b"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.655386 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" event={"ID":"df6fa2da-47ec-4a8f-b12b-59c640e0a361","Type":"ContainerStarted","Data":"e5431350e2473bca450ac012705e9eb05ffc78b7bc50ed5ce03f020d9018b990"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.658787 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" event={"ID":"ec637495-0e89-4ff0-9f59-2079144aa380","Type":"ContainerStarted","Data":"7cbe079ae5b56347fdf8b897a501282c67fc386781494d8719e6a52e8870ac3c"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.665189 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" event={"ID":"914a7337-c621-43a6-8294-50390f768e28","Type":"ContainerStarted","Data":"37c78ef493733ded48f1991545b25774123024cd52815a02baa73d28705ce51a"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.677599 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.678312 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:55.17829518 +0000 UTC m=+178.184469396 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.722312 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8" event={"ID":"102d31d1-4cba-46d5-8f36-727dd4379b90","Type":"ContainerStarted","Data":"11dc167c33c19c26400906705b585a07c5f5db8b540a0a3b8a662a185ebc6878"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.722858 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8" event={"ID":"102d31d1-4cba-46d5-8f36-727dd4379b90","Type":"ContainerStarted","Data":"a7278feb5a924204f430e00a3472aeda028ba5202b49b351054e08e3efd6c51c"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.735613 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-498px" event={"ID":"f7245cec-cee9-4aa5-8087-81ad2f450977","Type":"ContainerStarted","Data":"6a34d0e8c6fd11595952f9db725b74545f8feb0dba52347f623f45c7e2bdbd09"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.739320 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-498px" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.750136 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-498px container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 18 15:38:54 crc 
kubenswrapper[4696]: I0318 15:38:54.750195 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-498px" podUID="f7245cec-cee9-4aa5-8087-81ad2f450977" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.765695 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" event={"ID":"6b9aec63-9194-4040-b89c-6985e68607b9","Type":"ContainerStarted","Data":"242f29ba105dbee72f407455722908b0adcdeb84b0ad6f08a9737bc65bd26631"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.766772 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.782536 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.782927 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:55.282914513 +0000 UTC m=+178.289088719 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.789632 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vgrlm" event={"ID":"9bfd6550-6dba-4e2c-9b51-31134c8afa90","Type":"ContainerStarted","Data":"8184521fe94cfc4f197cbd0d2bd54290c9f1d6b3fe30da5e46b3e6a844ad1b84"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.844503 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gxl5n" podStartSLOduration=130.844485063 podStartE2EDuration="2m10.844485063s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:54.843485708 +0000 UTC m=+177.849659914" watchObservedRunningTime="2026-03-18 15:38:54.844485063 +0000 UTC m=+177.850659269" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.845162 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvmx5" event={"ID":"80f4cff4-8c9e-42f6-a4a8-26c0320f6a8b","Type":"ContainerStarted","Data":"f466d2c94b47bdb2a8e053d1ef2bb88da7a89e80a1224c3fa42a842d1c65e3d2"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.874588 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" 
event={"ID":"28cd9598-75cc-4316-a078-c4369354b5af","Type":"ContainerStarted","Data":"79e7dba70f5090fdccfcb1ef5af297818d10b68baa201a18b9de47e1d8abe919"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.885169 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.886309 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:55.386294535 +0000 UTC m=+178.392468741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.922468 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" event={"ID":"198822b4-2fd5-4225-bfe8-f233eaf3d8fc","Type":"ContainerStarted","Data":"2d801e8c553330a04c42b0ab929836468e18d8779760007ef05f4359aee9f997"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.922506 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" 
event={"ID":"198822b4-2fd5-4225-bfe8-f233eaf3d8fc","Type":"ContainerStarted","Data":"b64aa0c5a6811cd999646daa203fe8f09d6406b7ecaee641b0316ae48aecb986"} Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.928693 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kz8" podStartSLOduration=130.928670882 podStartE2EDuration="2m10.928670882s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:54.874258222 +0000 UTC m=+177.880432428" watchObservedRunningTime="2026-03-18 15:38:54.928670882 +0000 UTC m=+177.934845088" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.976891 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" podStartSLOduration=130.976871725 podStartE2EDuration="2m10.976871725s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:54.975338916 +0000 UTC m=+177.981513122" watchObservedRunningTime="2026-03-18 15:38:54.976871725 +0000 UTC m=+177.983045931" Mar 18 15:38:54 crc kubenswrapper[4696]: I0318 15:38:54.994400 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:54 crc kubenswrapper[4696]: E0318 15:38:54.996770 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:55.496756685 +0000 UTC m=+178.502930891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.029170 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vgrlm" podStartSLOduration=6.029150041 podStartE2EDuration="6.029150041s" podCreationTimestamp="2026-03-18 15:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:55.028920445 +0000 UTC m=+178.035094661" watchObservedRunningTime="2026-03-18 15:38:55.029150041 +0000 UTC m=+178.035324247" Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.100605 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.101324 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:38:55.601303586 +0000 UTC m=+178.607477792 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.115721 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cr7hv" podStartSLOduration=131.115700119 podStartE2EDuration="2m11.115700119s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:55.114716724 +0000 UTC m=+178.120890940" watchObservedRunningTime="2026-03-18 15:38:55.115700119 +0000 UTC m=+178.121874335" Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.119794 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4qhl8" podStartSLOduration=130.119774411 podStartE2EDuration="2m10.119774411s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:55.068199523 +0000 UTC m=+178.074373759" watchObservedRunningTime="2026-03-18 15:38:55.119774411 +0000 UTC m=+178.125948637" Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.148651 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2"] Mar 18 15:38:55 crc 
kubenswrapper[4696]: I0318 15:38:55.167013 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" podStartSLOduration=131.16698518 podStartE2EDuration="2m11.16698518s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:55.138857782 +0000 UTC m=+178.145031998" watchObservedRunningTime="2026-03-18 15:38:55.16698518 +0000 UTC m=+178.173159386" Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.193943 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.199306 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ff562"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.199605 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:38:55 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:38:55 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:38:55 crc kubenswrapper[4696]: healthz check failed Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.199768 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.202871 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.203435 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:55.703404476 +0000 UTC m=+178.709578682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.207857 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.208295 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdd4w" podStartSLOduration=131.208278919 podStartE2EDuration="2m11.208278919s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:55.170456607 +0000 UTC m=+178.176630813" watchObservedRunningTime="2026-03-18 15:38:55.208278919 +0000 UTC m=+178.214453115" Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.209085 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-7954f5f757-498px" podStartSLOduration=131.209080559 podStartE2EDuration="2m11.209080559s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:55.202398261 +0000 UTC m=+178.208572467" watchObservedRunningTime="2026-03-18 15:38:55.209080559 +0000 UTC m=+178.215254765" Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.236957 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.238988 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x2rv5"] Mar 18 15:38:55 crc kubenswrapper[4696]: W0318 15:38:55.293952 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0297eb3_0438_4db1_97bd_405779a01255.slice/crio-1965617b37ce91de17ba25ce17ca931e18b0d47ae05e8823ae510980560900b2 WatchSource:0}: Error finding container 1965617b37ce91de17ba25ce17ca931e18b0d47ae05e8823ae510980560900b2: Status 404 returned error can't find the container with id 1965617b37ce91de17ba25ce17ca931e18b0d47ae05e8823ae510980560900b2 Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.305965 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.306397 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:55.806383828 +0000 UTC m=+178.812558034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:55 crc kubenswrapper[4696]: W0318 15:38:55.306837 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f523e3f_cda8_49a8_871f_06d20ce5834e.slice/crio-8509da4305b68fbd0f0d54132f64c45891b368a06de0f2dc8cb282890aa0d21a WatchSource:0}: Error finding container 8509da4305b68fbd0f0d54132f64c45891b368a06de0f2dc8cb282890aa0d21a: Status 404 returned error can't find the container with id 8509da4305b68fbd0f0d54132f64c45891b368a06de0f2dc8cb282890aa0d21a Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.316019 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9wfks"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.335314 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4bxnr"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.356856 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2x748"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.397006 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.410126 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.410546 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:55.910511418 +0000 UTC m=+178.916685614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.417633 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-69fs4"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.423098 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.432085 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.449285 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4wjj"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.455104 4696 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.460041 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.482569 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.486873 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8286f"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.489319 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr"] Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.495468 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8"] Mar 18 15:38:55 crc kubenswrapper[4696]: W0318 15:38:55.503503 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55f95b82_6e61_4b39_a2e6_6685d37d61e8.slice/crio-82b719e9ab039008ea2afe99f1b272b0092b804ee335b8a8e949a81ac294aa41 WatchSource:0}: Error finding container 82b719e9ab039008ea2afe99f1b272b0092b804ee335b8a8e949a81ac294aa41: Status 404 returned error can't find the container with id 82b719e9ab039008ea2afe99f1b272b0092b804ee335b8a8e949a81ac294aa41 Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.511226 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.511498 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.011472769 +0000 UTC m=+179.017646975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.511812 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.512179 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.012166037 +0000 UTC m=+179.018340243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:55 crc kubenswrapper[4696]: W0318 15:38:55.557059 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4224061_726c_4bee_84d0_9b1dfddcdaa4.slice/crio-2f515891383010cd584ca85dec8eef1b0cf4225c35ee686fb45cb40f7bd222da WatchSource:0}: Error finding container 2f515891383010cd584ca85dec8eef1b0cf4225c35ee686fb45cb40f7bd222da: Status 404 returned error can't find the container with id 2f515891383010cd584ca85dec8eef1b0cf4225c35ee686fb45cb40f7bd222da Mar 18 15:38:55 crc kubenswrapper[4696]: W0318 15:38:55.592109 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod405ef230_1a28_4605_af3a_d3c7242943ce.slice/crio-cf7a649c6ce7d5380c165c26cba3c80115997568b1902ae7fda8c79a7fb46240 WatchSource:0}: Error finding container cf7a649c6ce7d5380c165c26cba3c80115997568b1902ae7fda8c79a7fb46240: Status 404 returned error can't find the container with id cf7a649c6ce7d5380c165c26cba3c80115997568b1902ae7fda8c79a7fb46240 Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.624296 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.624817 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.124787791 +0000 UTC m=+179.130961997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.759011 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-485wk"
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.759158 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mcsh8"]
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.759256 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564138-jzn9j"]
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.759250 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v"
Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.759569 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.259556333 +0000 UTC m=+179.265730539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.759573 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b"]
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.759811 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8"]
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.773897 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2"]
Mar 18 15:38:55 crc kubenswrapper[4696]: W0318 15:38:55.805474 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd651c778_1304_44b3_b21a_4c3ba2158db4.slice/crio-403185cef999edc11600d46a48c6dd0af2cbf68077d0663c283ce8444abc5565 WatchSource:0}: Error finding container 403185cef999edc11600d46a48c6dd0af2cbf68077d0663c283ce8444abc5565: Status 404 returned error can't find the container with id 403185cef999edc11600d46a48c6dd0af2cbf68077d0663c283ce8444abc5565
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.866332 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.867217 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.367189962 +0000 UTC m=+179.373364168 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.867425 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v"
Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.867724 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.367716455 +0000 UTC m=+179.373890661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:55 crc kubenswrapper[4696]: W0318 15:38:55.884088 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05a01af8_3578_44e1_8398_00e57ae2c4c2.slice/crio-2cc3225391280bf64d66f8b09c70864ba7eed87ff050948e5e374e540deeef3a WatchSource:0}: Error finding container 2cc3225391280bf64d66f8b09c70864ba7eed87ff050948e5e374e540deeef3a: Status 404 returned error can't find the container with id 2cc3225391280bf64d66f8b09c70864ba7eed87ff050948e5e374e540deeef3a
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.954027 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9wfks" event={"ID":"c38fdccb-264a-4d02-9f5c-19140a3df2f2","Type":"ContainerStarted","Data":"4f3a4c1c641dab0d8ec10a99f6788a66eb653e2d3a3c43d354bac168e57c3913"}
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.958297 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" event={"ID":"1a9512a6-dc1a-4c12-b080-55aeb712b1f0","Type":"ContainerStarted","Data":"dc08eab65fbe32ea7dd6e18a60de1147d1e7cad5d437f3007031c9a4bfb11d14"}
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.969954 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:38:55 crc kubenswrapper[4696]: E0318 15:38:55.970310 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.470288737 +0000 UTC m=+179.476462943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.972655 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" event={"ID":"cb247c8f-8684-477b-8c7a-4a222474a497","Type":"ContainerStarted","Data":"af68799085a6e9d541e38e79313d1f558bb881964afb2d057ffb0a0d25748ca4"}
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.989784 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ff562" event={"ID":"7e889bb6-3e2c-4131-a1f7-bca28b996cd3","Type":"ContainerStarted","Data":"e0dd1607fd0c2b32692435e2928e63ed27f638c4b80e45d71b48cb504564cef0"}
Mar 18 15:38:55 crc kubenswrapper[4696]: I0318 15:38:55.989838 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ff562" event={"ID":"7e889bb6-3e2c-4131-a1f7-bca28b996cd3","Type":"ContainerStarted","Data":"ec91723c24271e62f4eefe624098b1d57722e453afc3f1e699a05942fbc15a74"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.004747 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.026841 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ff562" podStartSLOduration=131.02682436 podStartE2EDuration="2m11.02682436s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:56.025022424 +0000 UTC m=+179.031196640" watchObservedRunningTime="2026-03-18 15:38:56.02682436 +0000 UTC m=+179.032998566"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.028614 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" event={"ID":"363b3949-b8a3-4fd4-a13d-281d29c61822","Type":"ContainerStarted","Data":"317eed48e8dd6a9bd4aa83d919328d1c7a0664c401a2866baf26e09142209c5d"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.040544 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" event={"ID":"d651c778-1304-44b3-b21a-4c3ba2158db4","Type":"ContainerStarted","Data":"403185cef999edc11600d46a48c6dd0af2cbf68077d0663c283ce8444abc5565"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.075362 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v"
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.077200 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.577186297 +0000 UTC m=+179.583360503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.078253 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" event={"ID":"c59c714a-1145-4630-9d30-24d15369e2b6","Type":"ContainerStarted","Data":"e7c17a3b9818be0793a284dc8040cb81c7fdcca8490d6f833bf36a2e3f06c475"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.119241 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" event={"ID":"df6fa2da-47ec-4a8f-b12b-59c640e0a361","Type":"ContainerStarted","Data":"775da4f075b79ecfcaf48a322a15a1131f40bcb2e16eda7dc61f86332bcfaf64"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.155772 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rksxf" podStartSLOduration=132.155751604 podStartE2EDuration="2m12.155751604s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:56.154663777 +0000 UTC m=+179.160837983" watchObservedRunningTime="2026-03-18 15:38:56.155751604 +0000 UTC m=+179.161925810"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.166687 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" event={"ID":"914a7337-c621-43a6-8294-50390f768e28","Type":"ContainerStarted","Data":"4ca99d66d409e47e1e97bdbea8837020eeee1beea2b54e3f418690f96a6c14c6"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.174656 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" event={"ID":"e4224061-726c-4bee-84d0-9b1dfddcdaa4","Type":"ContainerStarted","Data":"2f515891383010cd584ca85dec8eef1b0cf4225c35ee686fb45cb40f7bd222da"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.179874 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.181265 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.681249696 +0000 UTC m=+179.687423902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.187845 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" event={"ID":"05a01af8-3578-44e1-8398-00e57ae2c4c2","Type":"ContainerStarted","Data":"2cc3225391280bf64d66f8b09c70864ba7eed87ff050948e5e374e540deeef3a"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.198346 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 15:38:56 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld
Mar 18 15:38:56 crc kubenswrapper[4696]: [+]process-running ok
Mar 18 15:38:56 crc kubenswrapper[4696]: healthz check failed
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.198408 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.207171 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2x748" event={"ID":"b2dbc3c5-555b-4b76-af97-bbe1de318efe","Type":"ContainerStarted","Data":"a03d8d6e1c9215c8636a3cf6b2c31eeabb0c44d683ec8fe330bf63f310723545"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.224599 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" event={"ID":"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182","Type":"ContainerStarted","Data":"6a679f7eda6751b6b2a5b1afb4b6e1619e7180736a273c2641f40e5b29535f8d"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.234923 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" event={"ID":"0c05edf1-5079-4212-ba5c-19621b2500cf","Type":"ContainerStarted","Data":"fdae12b4492c464e2934447a8f01e30595df57290e8df0d0dad309693f36e37b"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.234965 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" event={"ID":"0c05edf1-5079-4212-ba5c-19621b2500cf","Type":"ContainerStarted","Data":"fe5fd356ccbf5ac74c5d37b88452b8eb8140d078b8be2d6ea44fa26090cda4d5"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.234976 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" event={"ID":"0c05edf1-5079-4212-ba5c-19621b2500cf","Type":"ContainerStarted","Data":"d1f095e6c06f44f84e0a590a2dcd7d5568201925db24cf4b71dfc226dd108ab0"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.266136 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n7svq" podStartSLOduration=131.266121322 podStartE2EDuration="2m11.266121322s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:56.200651334 +0000 UTC m=+179.206825540" watchObservedRunningTime="2026-03-18 15:38:56.266121322 +0000 UTC m=+179.272295528"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.266474 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xjgb" podStartSLOduration=131.266468271 podStartE2EDuration="2m11.266468271s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:56.265081696 +0000 UTC m=+179.271255902" watchObservedRunningTime="2026-03-18 15:38:56.266468271 +0000 UTC m=+179.272642477"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.276907 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" event={"ID":"b0297eb3-0438-4db1-97bd-405779a01255","Type":"ContainerStarted","Data":"1965617b37ce91de17ba25ce17ca931e18b0d47ae05e8823ae510980560900b2"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.281301 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v"
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.282446 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.782433793 +0000 UTC m=+179.788607999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.297392 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" event={"ID":"53d14281-ca5f-4420-a56f-a9fd192c7e58","Type":"ContainerStarted","Data":"025372cdc9081bdce162890e13964fec58ddcecf32a35e2fb6d8b61778ae7724"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.312562 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" podStartSLOduration=131.31254472 podStartE2EDuration="2m11.31254472s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:56.310796346 +0000 UTC m=+179.316970552" watchObservedRunningTime="2026-03-18 15:38:56.31254472 +0000 UTC m=+179.318718926"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.343660 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" event={"ID":"8339464b-c883-44d7-95eb-57c32689e91b","Type":"ContainerStarted","Data":"e3f8d242f60c1385fcd94796dad9cc2ca4dab05bff3bd93122f0a873f540d475"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.343725 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" event={"ID":"8339464b-c883-44d7-95eb-57c32689e91b","Type":"ContainerStarted","Data":"3254ebcc185b176f4c788783924b8002c5de7929ec97fa4a5ca4acd553f203d5"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.360299 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8286f" event={"ID":"10213877-0b4a-463d-b286-f2a0fb2e3fd6","Type":"ContainerStarted","Data":"64b91384348a46656ff9f4a022a93ff154e47493880b4e7bfd16e4f790e1af54"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.382103 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" podStartSLOduration=132.382084851 podStartE2EDuration="2m12.382084851s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:56.380928822 +0000 UTC m=+179.387103028" watchObservedRunningTime="2026-03-18 15:38:56.382084851 +0000 UTC m=+179.388259057"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.383120 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" event={"ID":"e00547a5-b2bd-495e-9d84-d6d6162bf42b","Type":"ContainerStarted","Data":"63a6aa3b0fe8c71d2cb453bdb4121e4c9fd783dd72305b56b90f9595d64a3760"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.383675 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.384128 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.884109181 +0000 UTC m=+179.890283387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.399945 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" event={"ID":"5f9e19eb-eadb-478e-b46d-f717a7a7c3de","Type":"ContainerStarted","Data":"276245f5a95a96306c28ac0b3f3a403090105fcc59c0591194950257e0440d10"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.446508 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" event={"ID":"6f523e3f-cda8-49a8-871f-06d20ce5834e","Type":"ContainerStarted","Data":"8509da4305b68fbd0f0d54132f64c45891b368a06de0f2dc8cb282890aa0d21a"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.459040 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" event={"ID":"55f95b82-6e61-4b39-a2e6-6685d37d61e8","Type":"ContainerStarted","Data":"82b719e9ab039008ea2afe99f1b272b0092b804ee335b8a8e949a81ac294aa41"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.482698 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" event={"ID":"a6eeff11-0e98-48f5-a868-a7016d57be14","Type":"ContainerStarted","Data":"e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.482761 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" event={"ID":"a6eeff11-0e98-48f5-a868-a7016d57be14","Type":"ContainerStarted","Data":"9c412c24c408897ebed5fb2b8b1d1c785149ebce78a181f37947ed81b311cbf6"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.482899 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" podUID="a6eeff11-0e98-48f5-a868-a7016d57be14" containerName="route-controller-manager" containerID="cri-o://e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56" gracePeriod=30
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.483731 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.492478 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v"
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.492936 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:56.99291276 +0000 UTC m=+179.999087026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.498696 4696 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-kn2g2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.498776 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" podUID="a6eeff11-0e98-48f5-a868-a7016d57be14" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.512732 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" podStartSLOduration=131.512708388 podStartE2EDuration="2m11.512708388s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:56.51080306 +0000 UTC m=+179.516977286" watchObservedRunningTime="2026-03-18 15:38:56.512708388 +0000 UTC m=+179.518882594"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.543493 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" event={"ID":"405ef230-1a28-4605-af3a-d3c7242943ce","Type":"ContainerStarted","Data":"cf7a649c6ce7d5380c165c26cba3c80115997568b1902ae7fda8c79a7fb46240"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.570135 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" event={"ID":"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7","Type":"ContainerStarted","Data":"18d95fd5a05005d0349a62b346a1365fdaef2b67d9253cbf4660b7e8511921d7"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.588314 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" event={"ID":"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e","Type":"ContainerStarted","Data":"aa81cc4c381eb53b4a91e0526fff9f81381bf2f542dbfb06dc6c543339b9283e"}
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.590921 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-498px container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.590975 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-498px" podUID="f7245cec-cee9-4aa5-8087-81ad2f450977" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.619385 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" podStartSLOduration=132.619360992 podStartE2EDuration="2m12.619360992s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:56.604888558 +0000 UTC m=+179.611062764" watchObservedRunningTime="2026-03-18 15:38:56.619360992 +0000 UTC m=+179.625535188"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.657699 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.658690 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.158668961 +0000 UTC m=+180.164843167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.658818 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v"
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.665285 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.165269107 +0000 UTC m=+180.171443313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.764818 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.765112 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.265097449 +0000 UTC m=+180.271271655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.873594 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v"
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.876822 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.37680232 +0000 UTC m=+180.382976706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.948662 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33264: no serving certificate available for the kubelet"
Mar 18 15:38:56 crc kubenswrapper[4696]: I0318 15:38:56.974794 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:38:56 crc kubenswrapper[4696]: E0318 15:38:56.975096 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.475080244 +0000 UTC m=+180.481254450 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.033240 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33276: no serving certificate available for the kubelet" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.078247 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.078619 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.578606699 +0000 UTC m=+180.584780905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.088867 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33278: no serving certificate available for the kubelet" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.154901 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33284: no serving certificate available for the kubelet" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.179678 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.180015 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.679994741 +0000 UTC m=+180.686168947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.192153 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:38:57 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:38:57 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:38:57 crc kubenswrapper[4696]: healthz check failed Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.192224 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.258491 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33290: no serving certificate available for the kubelet" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.282311 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.282756 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.782741837 +0000 UTC m=+180.788916043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.339349 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-kn2g2_a6eeff11-0e98-48f5-a868-a7016d57be14/route-controller-manager/0.log" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.339411 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.383618 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.384047 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:38:57.884019716 +0000 UTC m=+180.890193922 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.386273 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33306: no serving certificate available for the kubelet" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.388329 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk"] Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.388586 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6eeff11-0e98-48f5-a868-a7016d57be14" containerName="route-controller-manager" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.388600 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6eeff11-0e98-48f5-a868-a7016d57be14" containerName="route-controller-manager" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.388707 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6eeff11-0e98-48f5-a868-a7016d57be14" containerName="route-controller-manager" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.389081 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.409415 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk"] Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.485441 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6eeff11-0e98-48f5-a868-a7016d57be14-serving-cert\") pod \"a6eeff11-0e98-48f5-a868-a7016d57be14\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.485541 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-config\") pod \"a6eeff11-0e98-48f5-a868-a7016d57be14\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.485676 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwglp\" (UniqueName: \"kubernetes.io/projected/a6eeff11-0e98-48f5-a868-a7016d57be14-kube-api-access-jwglp\") pod \"a6eeff11-0e98-48f5-a868-a7016d57be14\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.485736 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-client-ca\") pod \"a6eeff11-0e98-48f5-a868-a7016d57be14\" (UID: \"a6eeff11-0e98-48f5-a868-a7016d57be14\") " Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.485861 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.486247 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:57.986234298 +0000 UTC m=+180.992408504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.487282 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-config" (OuterVolumeSpecName: "config") pod "a6eeff11-0e98-48f5-a868-a7016d57be14" (UID: "a6eeff11-0e98-48f5-a868-a7016d57be14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.488898 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-client-ca" (OuterVolumeSpecName: "client-ca") pod "a6eeff11-0e98-48f5-a868-a7016d57be14" (UID: "a6eeff11-0e98-48f5-a868-a7016d57be14"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.511628 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6eeff11-0e98-48f5-a868-a7016d57be14-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a6eeff11-0e98-48f5-a868-a7016d57be14" (UID: "a6eeff11-0e98-48f5-a868-a7016d57be14"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.512381 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6eeff11-0e98-48f5-a868-a7016d57be14-kube-api-access-jwglp" (OuterVolumeSpecName: "kube-api-access-jwglp") pod "a6eeff11-0e98-48f5-a868-a7016d57be14" (UID: "a6eeff11-0e98-48f5-a868-a7016d57be14"). InnerVolumeSpecName "kube-api-access-jwglp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.590866 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.591390 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-config\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.591417 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eb66a6d0-42a1-4e17-8478-17f0b32a2369-serving-cert\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.591449 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dz2v\" (UniqueName: \"kubernetes.io/projected/eb66a6d0-42a1-4e17-8478-17f0b32a2369-kube-api-access-9dz2v\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.591465 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-client-ca\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.591587 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.591600 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6eeff11-0e98-48f5-a868-a7016d57be14-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.591610 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eeff11-0e98-48f5-a868-a7016d57be14-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 
15:38:57.591619 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwglp\" (UniqueName: \"kubernetes.io/projected/a6eeff11-0e98-48f5-a868-a7016d57be14-kube-api-access-jwglp\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.591686 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.091673182 +0000 UTC m=+181.097847388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.612174 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33322: no serving certificate available for the kubelet" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.693057 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb66a6d0-42a1-4e17-8478-17f0b32a2369-serving-cert\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.693116 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dz2v\" (UniqueName: \"kubernetes.io/projected/eb66a6d0-42a1-4e17-8478-17f0b32a2369-kube-api-access-9dz2v\") pod 
\"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.693142 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-client-ca\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.693201 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.693285 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-config\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.696948 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.196932821 +0000 UTC m=+181.203107027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.711769 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-config\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.718905 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-client-ca\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.720893 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33332: no serving certificate available for the kubelet" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.748822 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb66a6d0-42a1-4e17-8478-17f0b32a2369-serving-cert\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.778594 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9dz2v\" (UniqueName: \"kubernetes.io/projected/eb66a6d0-42a1-4e17-8478-17f0b32a2369-kube-api-access-9dz2v\") pod \"route-controller-manager-5cf487b88-2zcgk\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.782879 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.794099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vpp52" event={"ID":"90a9c50c-4a8b-4731-9ee8-addc5dfb22f7","Type":"ContainerStarted","Data":"f1621b4cb3f2b362836d2452f186fe357329e8dd07957e097883e7b88ed210d8"} Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.794754 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.794867 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.294837985 +0000 UTC m=+181.301012191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.795100 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.795500 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.295489601 +0000 UTC m=+181.301663807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.867966 4696 generic.go:334] "Generic (PLEG): container finished" podID="f3bc2762-13c0-4c5b-b3c9-af4f68e1858e" containerID="73503989ce81fe6591594d4cd1ed426879ad48c5af0e99df3b9d909f44641ea9" exitCode=0 Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.868039 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" event={"ID":"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e","Type":"ContainerDied","Data":"73503989ce81fe6591594d4cd1ed426879ad48c5af0e99df3b9d909f44641ea9"} Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.899045 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.899551 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.399509089 +0000 UTC m=+181.405683295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.899857 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:57 crc kubenswrapper[4696]: E0318 15:38:57.900886 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.400874054 +0000 UTC m=+181.407048260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.951902 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" event={"ID":"1a9512a6-dc1a-4c12-b080-55aeb712b1f0","Type":"ContainerStarted","Data":"57b788a5ffd08bc191945f1a25d92b94c07aacffaf0a256822785dbe359ad5e7"} Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.951963 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" event={"ID":"1a9512a6-dc1a-4c12-b080-55aeb712b1f0","Type":"ContainerStarted","Data":"62d2797e6a7f66c13517aaa25707667278abfd114c5857dd21351a5490d38f8c"} Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.978758 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" event={"ID":"53d14281-ca5f-4420-a56f-a9fd192c7e58","Type":"ContainerStarted","Data":"80c72b0d80484e08712fcda32302919bd1d5654bfc8946aa934b57428e2b2a53"} Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.979112 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.980329 4696 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5r8lf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 
10.217.0.42:8443: connect: connection refused" start-of-body= Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.980369 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" podUID="53d14281-ca5f-4420-a56f-a9fd192c7e58" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.981918 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g5mnq" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.994057 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" event={"ID":"e00547a5-b2bd-495e-9d84-d6d6162bf42b","Type":"ContainerStarted","Data":"231dcece466db76a1cae124351f8fe5bd4177004d5cc2accf61171f0056f8d9f"} Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.995274 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.997187 4696 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5ljxk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 18 15:38:57 crc kubenswrapper[4696]: I0318 15:38:57.998080 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" podUID="e00547a5-b2bd-495e-9d84-d6d6162bf42b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 18 15:38:58 crc 
kubenswrapper[4696]: I0318 15:38:58.001918 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.002084 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.50206176 +0000 UTC m=+181.508235966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.002284 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.003963 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 15:38:58.503946648 +0000 UTC m=+181.510121054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.016991 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33336: no serving certificate available for the kubelet" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.018504 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2x748" event={"ID":"b2dbc3c5-555b-4b76-af97-bbe1de318efe","Type":"ContainerStarted","Data":"c33ca7c3a16e75724ffcaf33ed69c39d1f7c7aa1e3a4fbeb887c6479f8c64c2a"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.041249 4696 generic.go:334] "Generic (PLEG): container finished" podID="a6eeff11-0e98-48f5-a868-a7016d57be14" containerID="e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56" exitCode=2 Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.041398 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.041934 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" event={"ID":"a6eeff11-0e98-48f5-a868-a7016d57be14","Type":"ContainerDied","Data":"e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.041964 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2" event={"ID":"a6eeff11-0e98-48f5-a868-a7016d57be14","Type":"ContainerDied","Data":"9c412c24c408897ebed5fb2b8b1d1c785149ebce78a181f37947ed81b311cbf6"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.041981 4696 scope.go:117] "RemoveContainer" containerID="e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.061908 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" event={"ID":"363b3949-b8a3-4fd4-a13d-281d29c61822","Type":"ContainerStarted","Data":"f1722447dfd16f4ebe9ed0fb9fcbb63f8903ae4e44f8a2e4bb80b6a693c8ae3d"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.087631 4696 scope.go:117] "RemoveContainer" containerID="e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56" Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.098942 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56\": container with ID starting with e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56 not found: ID does not exist" containerID="e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56" Mar 18 15:38:58 crc kubenswrapper[4696]: 
I0318 15:38:58.099361 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56"} err="failed to get container status \"e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56\": rpc error: code = NotFound desc = could not find container \"e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56\": container with ID starting with e54d51ea6438a8f7f5f97945eb1599978034f9ea852d099d953078fb63040f56 not found: ID does not exist" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.105738 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.105831 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.605808391 +0000 UTC m=+181.611982597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.106222 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.107628 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.607619377 +0000 UTC m=+181.613793583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.112065 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8286f" event={"ID":"10213877-0b4a-463d-b286-f2a0fb2e3fd6","Type":"ContainerStarted","Data":"dc447631136e571e7d1c80d62186acf7a8b32d2574aa83d5ce6399e0754e9e3a"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.128351 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" event={"ID":"405ef230-1a28-4605-af3a-d3c7242943ce","Type":"ContainerStarted","Data":"2674291d4466416c442103c9869bb904466d859fc7499923cee430ab4156772d"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.133884 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" event={"ID":"6f523e3f-cda8-49a8-871f-06d20ce5834e","Type":"ContainerStarted","Data":"66f37c6a169b49aa7158d113505f9508dc7f161463a8122cd527c087ebf03e61"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.143760 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gx9sb" event={"ID":"b0297eb3-0438-4db1-97bd-405779a01255","Type":"ContainerStarted","Data":"4a785e55a0b91671371bbb9bfa0bad5f6c5cc4379b5467c4a6ec384f5054d52d"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.156197 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" 
event={"ID":"e4224061-726c-4bee-84d0-9b1dfddcdaa4","Type":"ContainerStarted","Data":"ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.157634 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.163192 4696 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r4wjj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.163242 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" podUID="e4224061-726c-4bee-84d0-9b1dfddcdaa4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.188079 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" event={"ID":"5f9e19eb-eadb-478e-b46d-f717a7a7c3de","Type":"ContainerStarted","Data":"64d785460fde68fbbe9b867321c8c8ab5b5f2f70d8d8732038fa5762719e282b"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.189414 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.205470 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:38:58 
crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:38:58 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:38:58 crc kubenswrapper[4696]: healthz check failed Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.205511 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.209018 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.211317 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.711297116 +0000 UTC m=+181.717471322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.219949 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" event={"ID":"ecd7ad5d-edd9-4ae4-8b42-cc17562d6182","Type":"ContainerStarted","Data":"7d56165a40802d04c4716559cc39130efe64c29f24eacc7532e89fe09e15e03a"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.223060 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9wfks" event={"ID":"c38fdccb-264a-4d02-9f5c-19140a3df2f2","Type":"ContainerStarted","Data":"55ce88fa7b2c6633010056b250c7d470373268d799f1c3d7bfd1cd519bf82442"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.224249 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.227975 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" event={"ID":"854ec8b0-a321-4bcb-9327-96742fec3f31","Type":"ContainerStarted","Data":"59b2ebb1d33687b3a8829d25cccde1f6c5a6ed3e1f29c40427879c8a33aaeae3"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.243285 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" event={"ID":"55f95b82-6e61-4b39-a2e6-6685d37d61e8","Type":"ContainerStarted","Data":"4acf53a37ec7257d0cbc03d40e7ee0de2af4de2141e99e304f7af0f34e01f80a"} Mar 18 15:38:58 
crc kubenswrapper[4696]: I0318 15:38:58.258902 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" event={"ID":"05a01af8-3578-44e1-8398-00e57ae2c4c2","Type":"ContainerStarted","Data":"17c2c8d342eea9479f14e176cfe9cc3fca8c70671f619c6117bd054490ae5c94"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.307368 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" event={"ID":"d651c778-1304-44b3-b21a-4c3ba2158db4","Type":"ContainerStarted","Data":"dbcd885409a82e719a9d50f03e3703c64c2b5ded15ff95a554f33279a4d042eb"} Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.307411 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.308417 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" podUID="ce730c10-9854-4705-bac9-07fc1f23402c" containerName="controller-manager" containerID="cri-o://2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a" gracePeriod=30 Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.313099 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.313757 4696 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rgwf8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure 
output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.313846 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" podUID="d651c778-1304-44b3-b21a-4c3ba2158db4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.315790 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.815772726 +0000 UTC m=+181.821946932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.346120 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-498px container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.346484 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-498px" podUID="f7245cec-cee9-4aa5-8087-81ad2f450977" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 
10.217.0.25:8080: connect: connection refused" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.393619 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" podStartSLOduration=133.393604544 podStartE2EDuration="2m13.393604544s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.391969283 +0000 UTC m=+181.398143489" watchObservedRunningTime="2026-03-18 15:38:58.393604544 +0000 UTC m=+181.399778750" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.426170 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.426605 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:58.926587854 +0000 UTC m=+181.932762060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.462695 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" podStartSLOduration=133.462664942 podStartE2EDuration="2m13.462664942s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.435718544 +0000 UTC m=+181.441892760" watchObservedRunningTime="2026-03-18 15:38:58.462664942 +0000 UTC m=+181.468839138" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.487581 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk"] Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.492539 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" podStartSLOduration=133.492502653 podStartE2EDuration="2m13.492502653s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.491835157 +0000 UTC m=+181.498009363" watchObservedRunningTime="2026-03-18 15:38:58.492502653 +0000 UTC m=+181.498676859" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.527528 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.527875 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.027836873 +0000 UTC m=+182.034011079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.544670 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2"] Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.545999 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kn2g2"] Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.625537 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ljcg6" podStartSLOduration=134.62548546 podStartE2EDuration="2m14.62548546s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
15:38:58.577234556 +0000 UTC m=+181.583408762" watchObservedRunningTime="2026-03-18 15:38:58.62548546 +0000 UTC m=+181.631659666" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.628274 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.628724 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.128710471 +0000 UTC m=+182.134884677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.678059 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" podStartSLOduration=133.678041983 podStartE2EDuration="2m13.678041983s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.627635384 +0000 UTC m=+181.633809590" watchObservedRunningTime="2026-03-18 15:38:58.678041983 +0000 UTC m=+181.684216189" Mar 18 15:38:58 crc 
kubenswrapper[4696]: I0318 15:38:58.708608 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9wfks" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.725814 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9wfks" podStartSLOduration=134.725793745 podStartE2EDuration="2m14.725793745s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.725002255 +0000 UTC m=+181.731176461" watchObservedRunningTime="2026-03-18 15:38:58.725793745 +0000 UTC m=+181.731967951" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.726065 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" podStartSLOduration=133.726060781 podStartE2EDuration="2m13.726060781s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.69817478 +0000 UTC m=+181.704348986" watchObservedRunningTime="2026-03-18 15:38:58.726060781 +0000 UTC m=+181.732234987" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.732373 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.732728 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.232715409 +0000 UTC m=+182.238889615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.752209 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m4tfr" podStartSLOduration=133.752192849 podStartE2EDuration="2m13.752192849s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.751642045 +0000 UTC m=+181.757816251" watchObservedRunningTime="2026-03-18 15:38:58.752192849 +0000 UTC m=+181.758367055" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.833749 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:58 crc kubenswrapper[4696]: E0318 15:38:58.834157 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:38:59.334130871 +0000 UTC m=+182.340305077 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.839103 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2x748" podStartSLOduration=9.839070396 podStartE2EDuration="9.839070396s" podCreationTimestamp="2026-03-18 15:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.791306183 +0000 UTC m=+181.797480389" watchObservedRunningTime="2026-03-18 15:38:58.839070396 +0000 UTC m=+181.845244602" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.935510 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.935685 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:58 crc 
kubenswrapper[4696]: E0318 15:38:58.936515 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.436499978 +0000 UTC m=+182.442674184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.968870 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 15:38:58 crc kubenswrapper[4696]: I0318 15:38:58.988898 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/701f97fc-e026-4b52-ac03-e4bccbf34972-metrics-certs\") pod \"network-metrics-daemon-k88c8\" (UID: \"701f97fc-e026-4b52-ac03-e4bccbf34972\") " pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.036398 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.037034 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.537016257 +0000 UTC m=+182.543190463 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.041913 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" podStartSLOduration=134.04189631 podStartE2EDuration="2m14.04189631s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:58.912005131 +0000 UTC m=+181.918179347" watchObservedRunningTime="2026-03-18 15:38:59.04189631 +0000 UTC m=+182.048070516" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.091186 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dshv6" podStartSLOduration=135.09117126 podStartE2EDuration="2m15.09117126s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:59.042197388 +0000 UTC m=+182.048371594" watchObservedRunningTime="2026-03-18 15:38:59.09117126 +0000 UTC m=+182.097345466" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.112170 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.138166 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce730c10-9854-4705-bac9-07fc1f23402c-serving-cert\") pod \"ce730c10-9854-4705-bac9-07fc1f23402c\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.138221 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-config\") pod \"ce730c10-9854-4705-bac9-07fc1f23402c\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.138247 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh9kd\" (UniqueName: \"kubernetes.io/projected/ce730c10-9854-4705-bac9-07fc1f23402c-kube-api-access-nh9kd\") pod \"ce730c10-9854-4705-bac9-07fc1f23402c\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.138402 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-proxy-ca-bundles\") pod \"ce730c10-9854-4705-bac9-07fc1f23402c\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.138418 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-client-ca\") pod \"ce730c10-9854-4705-bac9-07fc1f23402c\" (UID: \"ce730c10-9854-4705-bac9-07fc1f23402c\") " Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.138534 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.138846 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.63883311 +0000 UTC m=+182.645007316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.143318 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-client-ca" (OuterVolumeSpecName: "client-ca") pod "ce730c10-9854-4705-bac9-07fc1f23402c" (UID: "ce730c10-9854-4705-bac9-07fc1f23402c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.143379 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ce730c10-9854-4705-bac9-07fc1f23402c" (UID: "ce730c10-9854-4705-bac9-07fc1f23402c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.148093 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-config" (OuterVolumeSpecName: "config") pod "ce730c10-9854-4705-bac9-07fc1f23402c" (UID: "ce730c10-9854-4705-bac9-07fc1f23402c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.153672 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce730c10-9854-4705-bac9-07fc1f23402c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ce730c10-9854-4705-bac9-07fc1f23402c" (UID: "ce730c10-9854-4705-bac9-07fc1f23402c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.154179 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce730c10-9854-4705-bac9-07fc1f23402c-kube-api-access-nh9kd" (OuterVolumeSpecName: "kube-api-access-nh9kd") pod "ce730c10-9854-4705-bac9-07fc1f23402c" (UID: "ce730c10-9854-4705-bac9-07fc1f23402c"). InnerVolumeSpecName "kube-api-access-nh9kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.171282 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9cb8" podStartSLOduration=134.171264536 podStartE2EDuration="2m14.171264536s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:59.092144655 +0000 UTC m=+182.098318861" watchObservedRunningTime="2026-03-18 15:38:59.171264536 +0000 UTC m=+182.177438742" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.172152 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" podStartSLOduration=135.172146568 podStartE2EDuration="2m15.172146568s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:59.17022459 +0000 UTC m=+182.176398806" watchObservedRunningTime="2026-03-18 15:38:59.172146568 +0000 UTC m=+182.178320774" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.206738 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:38:59 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:38:59 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:38:59 crc kubenswrapper[4696]: healthz check failed Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.206810 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" 
podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.217872 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.228675 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k88c8" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.239262 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.239711 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce730c10-9854-4705-bac9-07fc1f23402c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.239738 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.239750 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh9kd\" (UniqueName: \"kubernetes.io/projected/ce730c10-9854-4705-bac9-07fc1f23402c-kube-api-access-nh9kd\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.239763 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:59 crc 
kubenswrapper[4696]: I0318 15:38:59.239776 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce730c10-9854-4705-bac9-07fc1f23402c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.239847 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.739831592 +0000 UTC m=+182.746005798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.340541 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.341449 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.841435529 +0000 UTC m=+182.847609725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.370232 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" event={"ID":"c59c714a-1145-4630-9d30-24d15369e2b6","Type":"ContainerStarted","Data":"f3c252847d7350deddacacf0682a0ffcf63ddfe5fee57a94ca04a844e8b250e0"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.395893 4696 ???:1] "http: TLS handshake error from 192.168.126.11:33352: no serving certificate available for the kubelet" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.397220 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vq9f" podStartSLOduration=135.397205672 podStartE2EDuration="2m15.397205672s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:59.396745851 +0000 UTC m=+182.402920057" watchObservedRunningTime="2026-03-18 15:38:59.397205672 +0000 UTC m=+182.403379868" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.406936 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8286f" event={"ID":"10213877-0b4a-463d-b286-f2a0fb2e3fd6","Type":"ContainerStarted","Data":"2e0fd952c723f27ba47036cd2edea257fe40b01e0fcd22c153c9a2d81996c82c"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.407028 4696 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8286f" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.432114 4696 generic.go:334] "Generic (PLEG): container finished" podID="ce730c10-9854-4705-bac9-07fc1f23402c" containerID="2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a" exitCode=0 Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.432226 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" event={"ID":"ce730c10-9854-4705-bac9-07fc1f23402c","Type":"ContainerDied","Data":"2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.432257 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" event={"ID":"ce730c10-9854-4705-bac9-07fc1f23402c","Type":"ContainerDied","Data":"cf7fd7a402f91b8d41dbf8d6de7c2936280f5d9cddbbad7090bd757d975e6807"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.432272 4696 scope.go:117] "RemoveContainer" containerID="2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.432625 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mcsh8" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.434633 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8286f" podStartSLOduration=10.434613054 podStartE2EDuration="10.434613054s" podCreationTimestamp="2026-03-18 15:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:59.432210933 +0000 UTC m=+182.438385150" watchObservedRunningTime="2026-03-18 15:38:59.434613054 +0000 UTC m=+182.440787290" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.442221 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.444148 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:59.944130733 +0000 UTC m=+182.950304939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.447112 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" event={"ID":"5f9e19eb-eadb-478e-b46d-f717a7a7c3de","Type":"ContainerStarted","Data":"07be96edc9b87aba390d7ab20edc12c6c7201a165e4227ff7a39f5eda763d118"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.450217 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" event={"ID":"f3bc2762-13c0-4c5b-b3c9-af4f68e1858e","Type":"ContainerStarted","Data":"9f385fb94456f7ff8ed88483a78e29412268400590e8fe48529cfec741f41113"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.452966 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x2rv5" event={"ID":"6f523e3f-cda8-49a8-871f-06d20ce5834e","Type":"ContainerStarted","Data":"5266bc54ae8139c08c43388eed53a75420db7e75c3015fcdf10c3d979bfa51a1"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.476035 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" event={"ID":"eb66a6d0-42a1-4e17-8478-17f0b32a2369","Type":"ContainerStarted","Data":"e227c4f4f6b2d48ef02fb017a5b8d7aa6fabc1a456fbc7b5aafc87772bfe3b06"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.476091 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" event={"ID":"eb66a6d0-42a1-4e17-8478-17f0b32a2369","Type":"ContainerStarted","Data":"f291e750a473bd5e7ac32aec24157bccf5852ae4af76e984c6daf8dcf025046a"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.476447 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.489854 4696 scope.go:117] "RemoveContainer" containerID="2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.490069 4696 patch_prober.go:28] interesting pod/route-controller-manager-5cf487b88-2zcgk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.490100 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" podUID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.493412 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a\": container with ID starting with 2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a not found: ID does not exist" containerID="2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.493446 4696 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a"} err="failed to get container status \"2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a\": rpc error: code = NotFound desc = could not find container \"2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a\": container with ID starting with 2571b91d1d6ca9a6db234b9074f24029e94985c3ccaeee07b2f61e6099f0f31a not found: ID does not exist" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.500848 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" event={"ID":"cb247c8f-8684-477b-8c7a-4a222474a497","Type":"ContainerStarted","Data":"99d651fca7a33c9839268ba1d007bed201359725986f1b7d71747c7b40837ee3"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.518187 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" podStartSLOduration=134.518168427 podStartE2EDuration="2m14.518168427s" podCreationTimestamp="2026-03-18 15:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:59.493128997 +0000 UTC m=+182.499303203" watchObservedRunningTime="2026-03-18 15:38:59.518168427 +0000 UTC m=+182.524342633" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.521245 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mcsh8"] Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.529145 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mcsh8"] Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.543369 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.543772 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.043760651 +0000 UTC m=+183.049934857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.546839 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" event={"ID":"55f95b82-6e61-4b39-a2e6-6685d37d61e8","Type":"ContainerStarted","Data":"a0d0c36d39b4e3950d5ddd0dcf1a2aec1d96eeaf41494515cee8351f8aff2279"} Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.549609 4696 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r4wjj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.549650 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" podUID="e4224061-726c-4bee-84d0-9b1dfddcdaa4" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.562428 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5r8lf" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.563171 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" podStartSLOduration=3.563153379 podStartE2EDuration="3.563153379s" podCreationTimestamp="2026-03-18 15:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:59.556207894 +0000 UTC m=+182.562382100" watchObservedRunningTime="2026-03-18 15:38:59.563153379 +0000 UTC m=+182.569327585" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.570847 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5ljxk" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.584366 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-69fs4" podStartSLOduration=135.584348762 podStartE2EDuration="2m15.584348762s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:59.579418488 +0000 UTC m=+182.585592684" watchObservedRunningTime="2026-03-18 15:38:59.584348762 +0000 UTC m=+182.590522968" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.650655 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6eeff11-0e98-48f5-a868-a7016d57be14" 
path="/var/lib/kubelet/pods/a6eeff11-0e98-48f5-a868-a7016d57be14/volumes" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.654986 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.655843 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.155828621 +0000 UTC m=+183.162002817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.674638 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce730c10-9854-4705-bac9-07fc1f23402c" path="/var/lib/kubelet/pods/ce730c10-9854-4705-bac9-07fc1f23402c/volumes" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.724922 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b59fd8595-dnvzb"] Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.725431 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce730c10-9854-4705-bac9-07fc1f23402c" containerName="controller-manager" Mar 18 15:38:59 crc 
kubenswrapper[4696]: I0318 15:38:59.725444 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce730c10-9854-4705-bac9-07fc1f23402c" containerName="controller-manager" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.737613 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce730c10-9854-4705-bac9-07fc1f23402c" containerName="controller-manager" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.738301 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.742587 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k88c8"] Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.744933 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.745131 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.745254 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.745686 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.745818 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.746274 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b59fd8595-dnvzb"] Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.815907 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-client-ca\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.816009 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.816054 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-config\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.816138 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3afda99d-4887-4fc9-9992-ac10eea0142b-serving-cert\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.816184 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/3afda99d-4887-4fc9-9992-ac10eea0142b-kube-api-access-grmsx\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: 
\"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.816207 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-proxy-ca-bundles\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.816576 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.316560647 +0000 UTC m=+183.322734853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.816947 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.828672 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.894898 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rgwf8" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 
15:38:59.919054 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.919185 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.419165229 +0000 UTC m=+183.425339435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.919785 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-client-ca\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.919903 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.919946 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-config\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.920014 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3afda99d-4887-4fc9-9992-ac10eea0142b-serving-cert\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.920054 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/3afda99d-4887-4fc9-9992-ac10eea0142b-kube-api-access-grmsx\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.920081 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-proxy-ca-bundles\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.921693 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-proxy-ca-bundles\") 
pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.922064 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-config\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.922774 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-client-ca\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: E0318 15:38:59.923381 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.423367775 +0000 UTC m=+183.429542061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.932034 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3afda99d-4887-4fc9-9992-ac10eea0142b-serving-cert\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:38:59 crc kubenswrapper[4696]: I0318 15:38:59.952274 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/3afda99d-4887-4fc9-9992-ac10eea0142b-kube-api-access-grmsx\") pod \"controller-manager-7b59fd8595-dnvzb\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.023069 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.023459 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:39:00.523444473 +0000 UTC m=+183.529618679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.124486 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.124864 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.624850115 +0000 UTC m=+183.631024321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.155003 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.194957 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:00 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:39:00 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:00 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.195243 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.225261 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.225738 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.725685053 +0000 UTC m=+183.731859259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.225911 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.226601 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.726589276 +0000 UTC m=+183.732763482 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.244613 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xd5v7"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.245855 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.251081 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.259314 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xd5v7"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.334570 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.334918 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-catalog-content\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 
18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.334993 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-utilities\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.335049 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27rp6\" (UniqueName: \"kubernetes.io/projected/aa06cecb-1f9f-431d-933f-0e87033cd695-kube-api-access-27rp6\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.335236 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.835183628 +0000 UTC m=+183.841357834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.431336 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8tkdr"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.432738 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.435342 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.436001 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-catalog-content\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.436074 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-utilities\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.436109 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.436160 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27rp6\" (UniqueName: \"kubernetes.io/projected/aa06cecb-1f9f-431d-933f-0e87033cd695-kube-api-access-27rp6\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.436908 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:00.936892508 +0000 UTC m=+183.943066704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.437134 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-catalog-content\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.440938 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-utilities\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.498967 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27rp6\" (UniqueName: \"kubernetes.io/projected/aa06cecb-1f9f-431d-933f-0e87033cd695-kube-api-access-27rp6\") pod \"community-operators-xd5v7\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.510220 4696 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tkdr"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.537450 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.537659 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:01.037628463 +0000 UTC m=+184.043802669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.537771 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.537891 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-utilities\") pod \"certified-operators-8tkdr\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.537962 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-catalog-content\") pod \"certified-operators-8tkdr\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.538023 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tz9b\" (UniqueName: \"kubernetes.io/projected/4e22cc1d-032f-4f3a-a0ca-51708beef610-kube-api-access-2tz9b\") pod \"certified-operators-8tkdr\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.538357 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:01.038348781 +0000 UTC m=+184.044522987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.563552 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" event={"ID":"cb247c8f-8684-477b-8c7a-4a222474a497","Type":"ContainerStarted","Data":"0d926313b9c6e5b7ece6efc58e158a34a909abc813d12588d309c7e1499539f5"} Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.571877 4696 generic.go:334] "Generic (PLEG): container finished" podID="8339464b-c883-44d7-95eb-57c32689e91b" containerID="e3f8d242f60c1385fcd94796dad9cc2ca4dab05bff3bd93122f0a873f540d475" exitCode=0 Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.571970 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" event={"ID":"8339464b-c883-44d7-95eb-57c32689e91b","Type":"ContainerDied","Data":"e3f8d242f60c1385fcd94796dad9cc2ca4dab05bff3bd93122f0a873f540d475"} Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.576779 4696 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.578343 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k88c8" event={"ID":"701f97fc-e026-4b52-ac03-e4bccbf34972","Type":"ContainerStarted","Data":"df11403b312c800d3aa45907b41f57c8c37d5487e3e0c95d19ef705a3db04ffa"} Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 
15:39:00.578413 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k88c8" event={"ID":"701f97fc-e026-4b52-ac03-e4bccbf34972","Type":"ContainerStarted","Data":"abc28f7a6159961049a922b1c7273bbc7c5747740818d727f5f016d22dce4cbe"} Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.584992 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.598738 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.599876 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.640696 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.640908 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-catalog-content\") pod \"certified-operators-8tkdr\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.640965 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tz9b\" (UniqueName: \"kubernetes.io/projected/4e22cc1d-032f-4f3a-a0ca-51708beef610-kube-api-access-2tz9b\") pod \"certified-operators-8tkdr\" (UID: 
\"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.641091 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-utilities\") pod \"certified-operators-8tkdr\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.642125 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-utilities\") pod \"certified-operators-8tkdr\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.642231 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-catalog-content\") pod \"certified-operators-8tkdr\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.642378 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:01.142354119 +0000 UTC m=+184.148528365 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.658691 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sjnmp"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.660146 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.672164 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sjnmp"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.697711 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tz9b\" (UniqueName: \"kubernetes.io/projected/4e22cc1d-032f-4f3a-a0ca-51708beef610-kube-api-access-2tz9b\") pod \"certified-operators-8tkdr\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.745292 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4xdg\" (UniqueName: \"kubernetes.io/projected/b0569eea-b948-4633-91d7-4ebfa02d5a8b-kube-api-access-g4xdg\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.745866 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-utilities\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.746130 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.746221 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-catalog-content\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.748212 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:01.248184522 +0000 UTC m=+184.254358928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.755344 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.848752 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.849687 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-catalog-content\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.849970 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4xdg\" (UniqueName: \"kubernetes.io/projected/b0569eea-b948-4633-91d7-4ebfa02d5a8b-kube-api-access-g4xdg\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.850204 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-utilities\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.851330 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-utilities\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.851860 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-catalog-content\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.852048 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:01.352020345 +0000 UTC m=+184.358194551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.853807 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ldd6n"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.874912 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ldd6n"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.875035 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.886444 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b59fd8595-dnvzb"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.890707 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4xdg\" (UniqueName: \"kubernetes.io/projected/b0569eea-b948-4633-91d7-4ebfa02d5a8b-kube-api-access-g4xdg\") pod \"community-operators-sjnmp\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.935505 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.937433 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.940239 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.940477 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.944579 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.954487 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46xmj\" (UniqueName: \"kubernetes.io/projected/459c4b74-f710-4b3a-b053-8b0326b87cb5-kube-api-access-46xmj\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.954551 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-utilities\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.954587 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-catalog-content\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.954660 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:00 crc kubenswrapper[4696]: E0318 15:39:00.954994 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:39:01.454971737 +0000 UTC m=+184.461145943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g894v" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.976975 4696 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T15:39:00.576931972Z","Handler":null,"Name":""} Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.989435 4696 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 15:39:00 crc kubenswrapper[4696]: I0318 15:39:00.989495 4696 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 15:39:00 crc 
kubenswrapper[4696]: I0318 15:39:00.989784 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.055758 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.056102 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.056144 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.056183 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46xmj\" (UniqueName: \"kubernetes.io/projected/459c4b74-f710-4b3a-b053-8b0326b87cb5-kube-api-access-46xmj\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.057051 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-utilities\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.057105 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-catalog-content\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.057939 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-catalog-content\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.058810 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-utilities\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.068241 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.083943 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46xmj\" (UniqueName: \"kubernetes.io/projected/459c4b74-f710-4b3a-b053-8b0326b87cb5-kube-api-access-46xmj\") pod \"certified-operators-ldd6n\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.148492 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tkdr"] Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.158869 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.158913 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.159439 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.158977 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.169260 4696 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.169656 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:01 crc kubenswrapper[4696]: W0318 15:39:01.171084 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e22cc1d_032f_4f3a_a0ca_51708beef610.slice/crio-f86dd1e0d8c1a177336ea59b01846ab0224fde14020f614154829cef7ae66e49 WatchSource:0}: Error finding container f86dd1e0d8c1a177336ea59b01846ab0224fde14020f614154829cef7ae66e49: Status 404 returned error can't find the container with id f86dd1e0d8c1a177336ea59b01846ab0224fde14020f614154829cef7ae66e49 Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.213627 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:01 crc kubenswrapper[4696]: [-]has-synced failed: 
reason withheld Mar 18 15:39:01 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:01 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.213682 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.220450 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.256818 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.277884 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.315864 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g894v\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.337083 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xd5v7"] Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.498092 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.503687 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.536708 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sjnmp"] Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.635512 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.638572 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" event={"ID":"3afda99d-4887-4fc9-9992-ac10eea0142b","Type":"ContainerStarted","Data":"0640d3ea7180d068688544090553dc23d1c63faccb138278813f0f6eee40ca21"} Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.638629 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" event={"ID":"3afda99d-4887-4fc9-9992-ac10eea0142b","Type":"ContainerStarted","Data":"124bba3a98a29413776748357083dec74ef312cf77d08a3edbcbe127cd478527"} Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.641219 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.649418 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.668302 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" podStartSLOduration=5.668280489 podStartE2EDuration="5.668280489s" podCreationTimestamp="2026-03-18 15:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:01.667318854 +0000 UTC m=+184.673493060" watchObservedRunningTime="2026-03-18 15:39:01.668280489 +0000 UTC m=+184.674454695" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.674493 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" event={"ID":"cb247c8f-8684-477b-8c7a-4a222474a497","Type":"ContainerStarted","Data":"089a37aacd6dc054cce4f1613a77eb5c43c09787c08b50329eebda00fa4bdc4b"} Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.675827 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xd5v7" event={"ID":"aa06cecb-1f9f-431d-933f-0e87033cd695","Type":"ContainerStarted","Data":"b65cf999813e5f1c1926a5cafce52cef7f2938456f73d425a0af85b4b47e91da"} Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.678280 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerID="5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e" exitCode=0 Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.678439 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tkdr" event={"ID":"4e22cc1d-032f-4f3a-a0ca-51708beef610","Type":"ContainerDied","Data":"5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e"} Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.678464 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tkdr" event={"ID":"4e22cc1d-032f-4f3a-a0ca-51708beef610","Type":"ContainerStarted","Data":"f86dd1e0d8c1a177336ea59b01846ab0224fde14020f614154829cef7ae66e49"} Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.688983 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k88c8" 
event={"ID":"701f97fc-e026-4b52-ac03-e4bccbf34972","Type":"ContainerStarted","Data":"20cf7bbdccb92f6ae2a48ae5a39973ed4336501680209e5e5649f7405d37c198"} Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.724702 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k88c8" podStartSLOduration=137.724687268 podStartE2EDuration="2m17.724687268s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:01.721903058 +0000 UTC m=+184.728077264" watchObservedRunningTime="2026-03-18 15:39:01.724687268 +0000 UTC m=+184.730861474" Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.753316 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ldd6n"] Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.853508 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:39:01 crc kubenswrapper[4696]: I0318 15:39:01.989239 4696 ???:1] "http: TLS handshake error from 192.168.126.11:44962: no serving certificate available for the kubelet" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.090179 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g894v"] Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.106026 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.111432 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.123264 4696 patch_prober.go:28] interesting pod/console-f9d7485db-qbzqg container/console namespace/openshift-console: Startup 
probe status=failure output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.123755 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-qbzqg" podUID="3733dd99-82f2-4602-b0e2-ece3c16cd446" containerName="console" probeResult="failure" output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.171511 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.191110 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-498px container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.191190 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-498px" podUID="f7245cec-cee9-4aa5-8087-81ad2f450977" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.191121 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.194772 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-498px container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 
15:39:02.198468 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:02 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:39:02 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:02 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.204446 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-498px" podUID="f7245cec-cee9-4aa5-8087-81ad2f450977" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.204559 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.301360 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8339464b-c883-44d7-95eb-57c32689e91b-config-volume\") pod \"8339464b-c883-44d7-95eb-57c32689e91b\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.301422 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8339464b-c883-44d7-95eb-57c32689e91b-secret-volume\") pod \"8339464b-c883-44d7-95eb-57c32689e91b\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.301476 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qv8hv\" (UniqueName: \"kubernetes.io/projected/8339464b-c883-44d7-95eb-57c32689e91b-kube-api-access-qv8hv\") pod \"8339464b-c883-44d7-95eb-57c32689e91b\" (UID: \"8339464b-c883-44d7-95eb-57c32689e91b\") " Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.302975 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8339464b-c883-44d7-95eb-57c32689e91b-config-volume" (OuterVolumeSpecName: "config-volume") pod "8339464b-c883-44d7-95eb-57c32689e91b" (UID: "8339464b-c883-44d7-95eb-57c32689e91b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.308955 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8339464b-c883-44d7-95eb-57c32689e91b-kube-api-access-qv8hv" (OuterVolumeSpecName: "kube-api-access-qv8hv") pod "8339464b-c883-44d7-95eb-57c32689e91b" (UID: "8339464b-c883-44d7-95eb-57c32689e91b"). InnerVolumeSpecName "kube-api-access-qv8hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.309432 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8339464b-c883-44d7-95eb-57c32689e91b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8339464b-c883-44d7-95eb-57c32689e91b" (UID: "8339464b-c883-44d7-95eb-57c32689e91b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.354159 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.354219 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.362987 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.403016 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8339464b-c883-44d7-95eb-57c32689e91b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.403055 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8339464b-c883-44d7-95eb-57c32689e91b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.403067 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv8hv\" (UniqueName: \"kubernetes.io/projected/8339464b-c883-44d7-95eb-57c32689e91b-kube-api-access-qv8hv\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.424325 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nsq74"] Mar 18 15:39:02 crc kubenswrapper[4696]: E0318 15:39:02.424644 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8339464b-c883-44d7-95eb-57c32689e91b" containerName="collect-profiles" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.424665 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8339464b-c883-44d7-95eb-57c32689e91b" containerName="collect-profiles" 
Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.424792 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8339464b-c883-44d7-95eb-57c32689e91b" containerName="collect-profiles" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.425690 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.427675 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.438428 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsq74"] Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.504457 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87qhb\" (UniqueName: \"kubernetes.io/projected/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-kube-api-access-87qhb\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.504578 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-utilities\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.504616 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-catalog-content\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " 
pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.595942 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.596489 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.607439 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87qhb\" (UniqueName: \"kubernetes.io/projected/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-kube-api-access-87qhb\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.607540 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-utilities\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.607562 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-catalog-content\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.608152 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-catalog-content\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " 
pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.608376 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-utilities\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.609387 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.663800 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87qhb\" (UniqueName: \"kubernetes.io/projected/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-kube-api-access-87qhb\") pod \"redhat-marketplace-nsq74\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.697764 4696 generic.go:334] "Generic (PLEG): container finished" podID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerID="cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064" exitCode=0 Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.697882 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldd6n" event={"ID":"459c4b74-f710-4b3a-b053-8b0326b87cb5","Type":"ContainerDied","Data":"cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.697928 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldd6n" event={"ID":"459c4b74-f710-4b3a-b053-8b0326b87cb5","Type":"ContainerStarted","Data":"b02f923bfeebc6cccc44585446fb6d69cf59b1a3f79ebcefc84c84a3af457145"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.706492 4696 
generic.go:334] "Generic (PLEG): container finished" podID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerID="3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055" exitCode=0 Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.706589 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjnmp" event={"ID":"b0569eea-b948-4633-91d7-4ebfa02d5a8b","Type":"ContainerDied","Data":"3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.706975 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjnmp" event={"ID":"b0569eea-b948-4633-91d7-4ebfa02d5a8b","Type":"ContainerStarted","Data":"de5dad7bfdb6ca619e578404c5ca190b46cc313e4fc55f84ade1b85ef8e6974e"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.711639 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" event={"ID":"cb247c8f-8684-477b-8c7a-4a222474a497","Type":"ContainerStarted","Data":"6a3ac97c264e4acb18a7d8f9c6cba5c311183475ca95761d36153674c31d0fd6"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.716886 4696 generic.go:334] "Generic (PLEG): container finished" podID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerID="5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88" exitCode=0 Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.717033 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xd5v7" event={"ID":"aa06cecb-1f9f-431d-933f-0e87033cd695","Type":"ContainerDied","Data":"5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.722650 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.722645 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6" event={"ID":"8339464b-c883-44d7-95eb-57c32689e91b","Type":"ContainerDied","Data":"3254ebcc185b176f4c788783924b8002c5de7929ec97fa4a5ca4acd553f203d5"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.722708 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3254ebcc185b176f4c788783924b8002c5de7929ec97fa4a5ca4acd553f203d5" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.729379 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" event={"ID":"5744ac11-6c36-4634-903e-298dc7b5ce45","Type":"ContainerStarted","Data":"e85e3d189cdcdbdb9cd340e02a41b68cda153c926b52711b9f8e181d7b9ea3b1"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.729752 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" event={"ID":"5744ac11-6c36-4634-903e-298dc7b5ce45","Type":"ContainerStarted","Data":"c7275eacd8a9e8df4984f3e3106ef4f0a7e433b98a6cfad329bdc9d48c3f20d0"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.729860 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.732200 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8","Type":"ContainerStarted","Data":"9da040c5b28d8cc46094b117e28850006b8dfcaa3e84ff5b5fd10e359c45602f"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.732257 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8","Type":"ContainerStarted","Data":"0e0346ac61c91cf5fdab8ce141ab1ac755d84e8d1c8688f6937b153be57bce64"} Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.739368 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nb9v4" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.742273 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6k8ll" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.743676 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.778658 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4bxnr" podStartSLOduration=13.778632034 podStartE2EDuration="13.778632034s" podCreationTimestamp="2026-03-18 15:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:02.770864678 +0000 UTC m=+185.777038884" watchObservedRunningTime="2026-03-18 15:39:02.778632034 +0000 UTC m=+185.784806240" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.868064 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cp998"] Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.889102 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" podStartSLOduration=138.889076273 podStartE2EDuration="2m18.889076273s" podCreationTimestamp="2026-03-18 15:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
15:39:02.886783226 +0000 UTC m=+185.892957432" watchObservedRunningTime="2026-03-18 15:39:02.889076273 +0000 UTC m=+185.895250479" Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.892818 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp998"] Mar 18 15:39:02 crc kubenswrapper[4696]: I0318 15:39:02.893050 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.018981 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qzs2\" (UniqueName: \"kubernetes.io/projected/0c1d314f-dde9-4056-964f-4eb911306306-kube-api-access-4qzs2\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.019061 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-utilities\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.019094 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-catalog-content\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.025895 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.025871626 
podStartE2EDuration="3.025871626s" podCreationTimestamp="2026-03-18 15:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:03.013794322 +0000 UTC m=+186.019968528" watchObservedRunningTime="2026-03-18 15:39:03.025871626 +0000 UTC m=+186.032045832" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.121155 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-utilities\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.121226 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-catalog-content\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.121311 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qzs2\" (UniqueName: \"kubernetes.io/projected/0c1d314f-dde9-4056-964f-4eb911306306-kube-api-access-4qzs2\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.122033 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-utilities\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.122307 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-catalog-content\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.162329 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qzs2\" (UniqueName: \"kubernetes.io/projected/0c1d314f-dde9-4056-964f-4eb911306306-kube-api-access-4qzs2\") pod \"redhat-marketplace-cp998\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.197711 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:03 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:39:03 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:03 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.197765 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.219908 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsq74"] Mar 18 15:39:03 crc kubenswrapper[4696]: W0318 15:39:03.240426 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9ddf1c7_0e1d_4f54_a93d_1665148569b2.slice/crio-5485c9cccc41913c6de6b33f3ff60bb9a62381c9b21bd31f04b4c0cd70f89dd3 WatchSource:0}: Error finding container 5485c9cccc41913c6de6b33f3ff60bb9a62381c9b21bd31f04b4c0cd70f89dd3: Status 404 returned error can't find the container with id 5485c9cccc41913c6de6b33f3ff60bb9a62381c9b21bd31f04b4c0cd70f89dd3 Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.320486 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.427628 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6p846"] Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.433813 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.436113 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.440190 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6p846"] Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.532353 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-utilities\") pod \"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.532448 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fczjg\" (UniqueName: 
\"kubernetes.io/projected/80160993-15c5-4eea-ac72-2094fe935ac1-kube-api-access-fczjg\") pod \"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.532736 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-catalog-content\") pod \"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.592380 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp998"] Mar 18 15:39:03 crc kubenswrapper[4696]: W0318 15:39:03.602682 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c1d314f_dde9_4056_964f_4eb911306306.slice/crio-04e956342d36a1e04a03cb5211dd771d9225d1be409e1825872f4536a0d7e07b WatchSource:0}: Error finding container 04e956342d36a1e04a03cb5211dd771d9225d1be409e1825872f4536a0d7e07b: Status 404 returned error can't find the container with id 04e956342d36a1e04a03cb5211dd771d9225d1be409e1825872f4536a0d7e07b Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.634239 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-utilities\") pod \"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.634308 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fczjg\" (UniqueName: \"kubernetes.io/projected/80160993-15c5-4eea-ac72-2094fe935ac1-kube-api-access-fczjg\") pod 
\"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.634401 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-catalog-content\") pod \"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.634938 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-utilities\") pod \"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.634969 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-catalog-content\") pod \"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.666995 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fczjg\" (UniqueName: \"kubernetes.io/projected/80160993-15c5-4eea-ac72-2094fe935ac1-kube-api-access-fczjg\") pod \"redhat-operators-6p846\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.739904 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp998" event={"ID":"0c1d314f-dde9-4056-964f-4eb911306306","Type":"ContainerStarted","Data":"04e956342d36a1e04a03cb5211dd771d9225d1be409e1825872f4536a0d7e07b"} Mar 18 
15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.743058 4696 generic.go:334] "Generic (PLEG): container finished" podID="65eafdb4-acf5-4d6d-8e11-792ea23cf5a8" containerID="9da040c5b28d8cc46094b117e28850006b8dfcaa3e84ff5b5fd10e359c45602f" exitCode=0 Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.743144 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8","Type":"ContainerDied","Data":"9da040c5b28d8cc46094b117e28850006b8dfcaa3e84ff5b5fd10e359c45602f"} Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.747657 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerID="a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04" exitCode=0 Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.747940 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq74" event={"ID":"c9ddf1c7-0e1d-4f54-a93d-1665148569b2","Type":"ContainerDied","Data":"a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04"} Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.748000 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq74" event={"ID":"c9ddf1c7-0e1d-4f54-a93d-1665148569b2","Type":"ContainerStarted","Data":"5485c9cccc41913c6de6b33f3ff60bb9a62381c9b21bd31f04b4c0cd70f89dd3"} Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.806003 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.827015 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q82hl"] Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.829552 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.838217 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q82hl"] Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.940927 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-utilities\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.940974 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-catalog-content\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:03 crc kubenswrapper[4696]: I0318 15:39:03.941022 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddhzn\" (UniqueName: \"kubernetes.io/projected/cfaa1769-2588-440e-9757-ded58dcb0ac3-kube-api-access-ddhzn\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.042963 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-catalog-content\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.043631 4696 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ddhzn\" (UniqueName: \"kubernetes.io/projected/cfaa1769-2588-440e-9757-ded58dcb0ac3-kube-api-access-ddhzn\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.043732 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-catalog-content\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.043759 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-utilities\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.044172 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-utilities\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.063138 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddhzn\" (UniqueName: \"kubernetes.io/projected/cfaa1769-2588-440e-9757-ded58dcb0ac3-kube-api-access-ddhzn\") pod \"redhat-operators-q82hl\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.209278 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:04 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:39:04 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:04 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.209953 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.230153 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.399640 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6p846"] Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.450021 4696 ???:1] "http: TLS handshake error from 192.168.126.11:44968: no serving certificate available for the kubelet" Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.716310 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q82hl"] Mar 18 15:39:04 crc kubenswrapper[4696]: W0318 15:39:04.751128 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfaa1769_2588_440e_9757_ded58dcb0ac3.slice/crio-760e7def20f0e7e5d9f10888e3613f302cd10237ecd26c6e7ea21722e1dacb4e WatchSource:0}: Error finding container 760e7def20f0e7e5d9f10888e3613f302cd10237ecd26c6e7ea21722e1dacb4e: Status 404 returned error can't find the container with id 760e7def20f0e7e5d9f10888e3613f302cd10237ecd26c6e7ea21722e1dacb4e Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.767266 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-6p846" event={"ID":"80160993-15c5-4eea-ac72-2094fe935ac1","Type":"ContainerStarted","Data":"0de4735b60cbe9c198533dd81efe1303a5220c5d6109bfd2e986ca272ad80608"} Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.771603 4696 generic.go:334] "Generic (PLEG): container finished" podID="0c1d314f-dde9-4056-964f-4eb911306306" containerID="6907d3f478ae9a354d8ab82d2ef1828baf9eea72a4227fa132d734c848ca50ac" exitCode=0 Mar 18 15:39:04 crc kubenswrapper[4696]: I0318 15:39:04.771676 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp998" event={"ID":"0c1d314f-dde9-4056-964f-4eb911306306","Type":"ContainerDied","Data":"6907d3f478ae9a354d8ab82d2ef1828baf9eea72a4227fa132d734c848ca50ac"} Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.127731 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.169942 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kube-api-access\") pod \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\" (UID: \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\") " Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.170023 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kubelet-dir\") pod \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\" (UID: \"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8\") " Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.170122 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "65eafdb4-acf5-4d6d-8e11-792ea23cf5a8" (UID: 
"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.170390 4696 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.192014 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "65eafdb4-acf5-4d6d-8e11-792ea23cf5a8" (UID: "65eafdb4-acf5-4d6d-8e11-792ea23cf5a8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.194116 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:05 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:39:05 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:05 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.194184 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.271363 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65eafdb4-acf5-4d6d-8e11-792ea23cf5a8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 
15:39:05.785330 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.785512 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"65eafdb4-acf5-4d6d-8e11-792ea23cf5a8","Type":"ContainerDied","Data":"0e0346ac61c91cf5fdab8ce141ab1ac755d84e8d1c8688f6937b153be57bce64"} Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.785574 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e0346ac61c91cf5fdab8ce141ab1ac755d84e8d1c8688f6937b153be57bce64" Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.789175 4696 generic.go:334] "Generic (PLEG): container finished" podID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerID="7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356" exitCode=0 Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.789251 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q82hl" event={"ID":"cfaa1769-2588-440e-9757-ded58dcb0ac3","Type":"ContainerDied","Data":"7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356"} Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.789271 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q82hl" event={"ID":"cfaa1769-2588-440e-9757-ded58dcb0ac3","Type":"ContainerStarted","Data":"760e7def20f0e7e5d9f10888e3613f302cd10237ecd26c6e7ea21722e1dacb4e"} Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.805941 4696 generic.go:334] "Generic (PLEG): container finished" podID="80160993-15c5-4eea-ac72-2094fe935ac1" containerID="6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c" exitCode=0 Mar 18 15:39:05 crc kubenswrapper[4696]: I0318 15:39:05.806012 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-6p846" event={"ID":"80160993-15c5-4eea-ac72-2094fe935ac1","Type":"ContainerDied","Data":"6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c"} Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.194588 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:06 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:39:06 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:06 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.194671 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.195342 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 15:39:06 crc kubenswrapper[4696]: E0318 15:39:06.195656 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65eafdb4-acf5-4d6d-8e11-792ea23cf5a8" containerName="pruner" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.195726 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="65eafdb4-acf5-4d6d-8e11-792ea23cf5a8" containerName="pruner" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.195881 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="65eafdb4-acf5-4d6d-8e11-792ea23cf5a8" containerName="pruner" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.196408 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.198837 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.199126 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.212956 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.287859 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/816ad4e6-f60f-4487-a12b-529fc685d209-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"816ad4e6-f60f-4487-a12b-529fc685d209\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.287953 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/816ad4e6-f60f-4487-a12b-529fc685d209-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"816ad4e6-f60f-4487-a12b-529fc685d209\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.390048 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/816ad4e6-f60f-4487-a12b-529fc685d209-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"816ad4e6-f60f-4487-a12b-529fc685d209\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.390130 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/816ad4e6-f60f-4487-a12b-529fc685d209-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"816ad4e6-f60f-4487-a12b-529fc685d209\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.390217 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/816ad4e6-f60f-4487-a12b-529fc685d209-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"816ad4e6-f60f-4487-a12b-529fc685d209\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.414080 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/816ad4e6-f60f-4487-a12b-529fc685d209-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"816ad4e6-f60f-4487-a12b-529fc685d209\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:06 crc kubenswrapper[4696]: I0318 15:39:06.521723 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:07 crc kubenswrapper[4696]: I0318 15:39:07.145400 4696 ???:1] "http: TLS handshake error from 192.168.126.11:44984: no serving certificate available for the kubelet" Mar 18 15:39:07 crc kubenswrapper[4696]: I0318 15:39:07.192674 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:07 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:39:07 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:07 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:07 crc kubenswrapper[4696]: I0318 15:39:07.192752 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:07 crc kubenswrapper[4696]: I0318 15:39:07.724691 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8286f" Mar 18 15:39:08 crc kubenswrapper[4696]: I0318 15:39:08.191269 4696 patch_prober.go:28] interesting pod/router-default-5444994796-xk9kb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:39:08 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Mar 18 15:39:08 crc kubenswrapper[4696]: [+]process-running ok Mar 18 15:39:08 crc kubenswrapper[4696]: healthz check failed Mar 18 15:39:08 crc kubenswrapper[4696]: I0318 15:39:08.191336 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xk9kb" podUID="94443ebd-69c4-4f6b-90a6-13cd2da51741" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:39:09 crc kubenswrapper[4696]: I0318 15:39:09.199347 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:39:09 crc kubenswrapper[4696]: I0318 15:39:09.208549 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xk9kb" Mar 18 15:39:12 crc kubenswrapper[4696]: I0318 15:39:12.122594 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:39:12 crc kubenswrapper[4696]: I0318 15:39:12.128867 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:39:12 crc kubenswrapper[4696]: I0318 15:39:12.205950 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-498px" Mar 18 15:39:15 crc kubenswrapper[4696]: I0318 15:39:15.026810 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b59fd8595-dnvzb"] Mar 18 15:39:15 crc kubenswrapper[4696]: I0318 15:39:15.034889 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" podUID="3afda99d-4887-4fc9-9992-ac10eea0142b" containerName="controller-manager" containerID="cri-o://0640d3ea7180d068688544090553dc23d1c63faccb138278813f0f6eee40ca21" gracePeriod=30 Mar 18 15:39:15 crc kubenswrapper[4696]: I0318 15:39:15.039703 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk"] Mar 18 15:39:15 crc kubenswrapper[4696]: I0318 15:39:15.039953 4696 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" podUID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" containerName="route-controller-manager" containerID="cri-o://e227c4f4f6b2d48ef02fb017a5b8d7aa6fabc1a456fbc7b5aafc87772bfe3b06" gracePeriod=30 Mar 18 15:39:15 crc kubenswrapper[4696]: I0318 15:39:15.893633 4696 generic.go:334] "Generic (PLEG): container finished" podID="3afda99d-4887-4fc9-9992-ac10eea0142b" containerID="0640d3ea7180d068688544090553dc23d1c63faccb138278813f0f6eee40ca21" exitCode=0 Mar 18 15:39:15 crc kubenswrapper[4696]: I0318 15:39:15.893875 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" event={"ID":"3afda99d-4887-4fc9-9992-ac10eea0142b","Type":"ContainerDied","Data":"0640d3ea7180d068688544090553dc23d1c63faccb138278813f0f6eee40ca21"} Mar 18 15:39:15 crc kubenswrapper[4696]: I0318 15:39:15.899707 4696 generic.go:334] "Generic (PLEG): container finished" podID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" containerID="e227c4f4f6b2d48ef02fb017a5b8d7aa6fabc1a456fbc7b5aafc87772bfe3b06" exitCode=0 Mar 18 15:39:15 crc kubenswrapper[4696]: I0318 15:39:15.899786 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" event={"ID":"eb66a6d0-42a1-4e17-8478-17f0b32a2369","Type":"ContainerDied","Data":"e227c4f4f6b2d48ef02fb017a5b8d7aa6fabc1a456fbc7b5aafc87772bfe3b06"} Mar 18 15:39:17 crc kubenswrapper[4696]: I0318 15:39:17.784396 4696 patch_prober.go:28] interesting pod/route-controller-manager-5cf487b88-2zcgk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 18 15:39:17 crc kubenswrapper[4696]: I0318 15:39:17.784455 4696 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" podUID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 18 15:39:20 crc kubenswrapper[4696]: I0318 15:39:20.156646 4696 patch_prober.go:28] interesting pod/controller-manager-7b59fd8595-dnvzb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 18 15:39:20 crc kubenswrapper[4696]: I0318 15:39:20.156969 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" podUID="3afda99d-4887-4fc9-9992-ac10eea0142b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 18 15:39:21 crc kubenswrapper[4696]: I0318 15:39:21.510612 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.841606 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.842308 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.843691 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.845972 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.858170 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.911921 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.943988 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.944091 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.946574 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.956564 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.969011 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:24 crc kubenswrapper[4696]: I0318 15:39:24.981594 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:25 crc kubenswrapper[4696]: I0318 15:39:25.119760 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:25 crc kubenswrapper[4696]: I0318 15:39:25.128239 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:25 crc kubenswrapper[4696]: I0318 15:39:25.136688 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:26 crc kubenswrapper[4696]: E0318 15:39:26.196608 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 15:39:26 crc kubenswrapper[4696]: E0318 15:39:26.196808 4696 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:39:26 crc kubenswrapper[4696]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 15:39:26 crc kubenswrapper[4696]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7ftd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564138-jzn9j_openshift-infra(854ec8b0-a321-4bcb-9327-96742fec3f31): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled 
Mar 18 15:39:26 crc kubenswrapper[4696]: > logger="UnhandledError" Mar 18 15:39:26 crc kubenswrapper[4696]: E0318 15:39:26.197893 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" podUID="854ec8b0-a321-4bcb-9327-96742fec3f31" Mar 18 15:39:26 crc kubenswrapper[4696]: E0318 15:39:26.961858 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" podUID="854ec8b0-a321-4bcb-9327-96742fec3f31" Mar 18 15:39:27 crc kubenswrapper[4696]: I0318 15:39:27.650123 4696 ???:1] "http: TLS handshake error from 192.168.126.11:53568: no serving certificate available for the kubelet" Mar 18 15:39:28 crc kubenswrapper[4696]: I0318 15:39:28.791480 4696 patch_prober.go:28] interesting pod/route-controller-manager-5cf487b88-2zcgk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:39:28 crc kubenswrapper[4696]: I0318 15:39:28.791830 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" podUID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.693296 4696 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.695837 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.724792 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6585cd7b87-prc4l"] Mar 18 15:39:30 crc kubenswrapper[4696]: E0318 15:39:30.729751 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afda99d-4887-4fc9-9992-ac10eea0142b" containerName="controller-manager" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.729788 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afda99d-4887-4fc9-9992-ac10eea0142b" containerName="controller-manager" Mar 18 15:39:30 crc kubenswrapper[4696]: E0318 15:39:30.729816 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" containerName="route-controller-manager" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.729824 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" containerName="route-controller-manager" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.729949 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" containerName="route-controller-manager" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.729964 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afda99d-4887-4fc9-9992-ac10eea0142b" containerName="controller-manager" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.730556 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.736127 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6585cd7b87-prc4l"] Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.827987 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-client-ca\") pod \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828072 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-config\") pod \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828101 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3afda99d-4887-4fc9-9992-ac10eea0142b-serving-cert\") pod \"3afda99d-4887-4fc9-9992-ac10eea0142b\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828123 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/3afda99d-4887-4fc9-9992-ac10eea0142b-kube-api-access-grmsx\") pod \"3afda99d-4887-4fc9-9992-ac10eea0142b\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828144 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dz2v\" (UniqueName: \"kubernetes.io/projected/eb66a6d0-42a1-4e17-8478-17f0b32a2369-kube-api-access-9dz2v\") pod 
\"eb66a6d0-42a1-4e17-8478-17f0b32a2369\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828197 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-client-ca\") pod \"3afda99d-4887-4fc9-9992-ac10eea0142b\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828215 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-config\") pod \"3afda99d-4887-4fc9-9992-ac10eea0142b\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828233 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb66a6d0-42a1-4e17-8478-17f0b32a2369-serving-cert\") pod \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\" (UID: \"eb66a6d0-42a1-4e17-8478-17f0b32a2369\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828274 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-proxy-ca-bundles\") pod \"3afda99d-4887-4fc9-9992-ac10eea0142b\" (UID: \"3afda99d-4887-4fc9-9992-ac10eea0142b\") " Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828402 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-proxy-ca-bundles\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828458 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-client-ca\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828475 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-config\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828508 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm5vq\" (UniqueName: \"kubernetes.io/projected/2c47ee76-0e0b-42f5-b824-be9c76bffd78-kube-api-access-vm5vq\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.828562 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c47ee76-0e0b-42f5-b824-be9c76bffd78-serving-cert\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.829693 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-client-ca" (OuterVolumeSpecName: "client-ca") pod "3afda99d-4887-4fc9-9992-ac10eea0142b" (UID: 
"3afda99d-4887-4fc9-9992-ac10eea0142b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.829884 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3afda99d-4887-4fc9-9992-ac10eea0142b" (UID: "3afda99d-4887-4fc9-9992-ac10eea0142b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.830036 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb66a6d0-42a1-4e17-8478-17f0b32a2369" (UID: "eb66a6d0-42a1-4e17-8478-17f0b32a2369"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.830070 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-config" (OuterVolumeSpecName: "config") pod "eb66a6d0-42a1-4e17-8478-17f0b32a2369" (UID: "eb66a6d0-42a1-4e17-8478-17f0b32a2369"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.830203 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-config" (OuterVolumeSpecName: "config") pod "3afda99d-4887-4fc9-9992-ac10eea0142b" (UID: "3afda99d-4887-4fc9-9992-ac10eea0142b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.834677 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb66a6d0-42a1-4e17-8478-17f0b32a2369-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb66a6d0-42a1-4e17-8478-17f0b32a2369" (UID: "eb66a6d0-42a1-4e17-8478-17f0b32a2369"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.834729 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afda99d-4887-4fc9-9992-ac10eea0142b-kube-api-access-grmsx" (OuterVolumeSpecName: "kube-api-access-grmsx") pod "3afda99d-4887-4fc9-9992-ac10eea0142b" (UID: "3afda99d-4887-4fc9-9992-ac10eea0142b"). InnerVolumeSpecName "kube-api-access-grmsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.834820 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb66a6d0-42a1-4e17-8478-17f0b32a2369-kube-api-access-9dz2v" (OuterVolumeSpecName: "kube-api-access-9dz2v") pod "eb66a6d0-42a1-4e17-8478-17f0b32a2369" (UID: "eb66a6d0-42a1-4e17-8478-17f0b32a2369"). InnerVolumeSpecName "kube-api-access-9dz2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.844656 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afda99d-4887-4fc9-9992-ac10eea0142b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3afda99d-4887-4fc9-9992-ac10eea0142b" (UID: "3afda99d-4887-4fc9-9992-ac10eea0142b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930248 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-client-ca\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930318 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-config\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930369 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm5vq\" (UniqueName: \"kubernetes.io/projected/2c47ee76-0e0b-42f5-b824-be9c76bffd78-kube-api-access-vm5vq\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930410 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c47ee76-0e0b-42f5-b824-be9c76bffd78-serving-cert\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930442 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-proxy-ca-bundles\") pod 
\"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930502 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930534 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb66a6d0-42a1-4e17-8478-17f0b32a2369-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930548 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930560 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3afda99d-4887-4fc9-9992-ac10eea0142b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930577 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930707 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb66a6d0-42a1-4e17-8478-17f0b32a2369-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930721 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3afda99d-4887-4fc9-9992-ac10eea0142b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc 
kubenswrapper[4696]: I0318 15:39:30.930733 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grmsx\" (UniqueName: \"kubernetes.io/projected/3afda99d-4887-4fc9-9992-ac10eea0142b-kube-api-access-grmsx\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.930745 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dz2v\" (UniqueName: \"kubernetes.io/projected/eb66a6d0-42a1-4e17-8478-17f0b32a2369-kube-api-access-9dz2v\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.931385 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-client-ca\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.931801 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-proxy-ca-bundles\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.933331 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-config\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.935731 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2c47ee76-0e0b-42f5-b824-be9c76bffd78-serving-cert\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.947840 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm5vq\" (UniqueName: \"kubernetes.io/projected/2c47ee76-0e0b-42f5-b824-be9c76bffd78-kube-api-access-vm5vq\") pod \"controller-manager-6585cd7b87-prc4l\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") " pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.983631 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" event={"ID":"3afda99d-4887-4fc9-9992-ac10eea0142b","Type":"ContainerDied","Data":"124bba3a98a29413776748357083dec74ef312cf77d08a3edbcbe127cd478527"} Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.983680 4696 scope.go:117] "RemoveContainer" containerID="0640d3ea7180d068688544090553dc23d1c63faccb138278813f0f6eee40ca21" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.983795 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.986809 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" event={"ID":"eb66a6d0-42a1-4e17-8478-17f0b32a2369","Type":"ContainerDied","Data":"f291e750a473bd5e7ac32aec24157bccf5852ae4af76e984c6daf8dcf025046a"} Mar 18 15:39:30 crc kubenswrapper[4696]: I0318 15:39:30.986917 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk" Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.015145 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b59fd8595-dnvzb"] Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.018236 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b59fd8595-dnvzb"] Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.028481 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk"] Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.031125 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf487b88-2zcgk"] Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.067374 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.156283 4696 patch_prober.go:28] interesting pod/controller-manager-7b59fd8595-dnvzb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.156349 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7b59fd8595-dnvzb" podUID="3afda99d-4887-4fc9-9992-ac10eea0142b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.604647 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afda99d-4887-4fc9-9992-ac10eea0142b" path="/var/lib/kubelet/pods/3afda99d-4887-4fc9-9992-ac10eea0142b/volumes" Mar 18 15:39:31 crc kubenswrapper[4696]: I0318 15:39:31.605185 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb66a6d0-42a1-4e17-8478-17f0b32a2369" path="/var/lib/kubelet/pods/eb66a6d0-42a1-4e17-8478-17f0b32a2369/volumes" Mar 18 15:39:32 crc kubenswrapper[4696]: E0318 15:39:32.566504 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 15:39:32 crc kubenswrapper[4696]: E0318 15:39:32.566739 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tz9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8tkdr_openshift-marketplace(4e22cc1d-032f-4f3a-a0ca-51708beef610): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:39:32 crc kubenswrapper[4696]: E0318 15:39:32.567912 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8tkdr" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.820195 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945745784-2j84f"] Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.821017 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.826155 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.826399 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.827222 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.828030 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.828229 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.830897 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:39:32 crc kubenswrapper[4696]: I0318 15:39:32.838066 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945745784-2j84f"] Mar 18 15:39:32 crc 
kubenswrapper[4696]: I0318 15:39:32.915498 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7p8b" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.008541 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjvl\" (UniqueName: \"kubernetes.io/projected/9fc56b90-3630-479f-a9f0-87ed9c0de838-kube-api-access-dtjvl\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.008594 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-config\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.008629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fc56b90-3630-479f-a9f0-87ed9c0de838-serving-cert\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.008693 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-client-ca\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " 
pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.109369 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjvl\" (UniqueName: \"kubernetes.io/projected/9fc56b90-3630-479f-a9f0-87ed9c0de838-kube-api-access-dtjvl\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.109441 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-config\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.109486 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fc56b90-3630-479f-a9f0-87ed9c0de838-serving-cert\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.109545 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-client-ca\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.110408 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-client-ca\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.111487 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-config\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.115198 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fc56b90-3630-479f-a9f0-87ed9c0de838-serving-cert\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.130202 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjvl\" (UniqueName: \"kubernetes.io/projected/9fc56b90-3630-479f-a9f0-87ed9c0de838-kube-api-access-dtjvl\") pod \"route-controller-manager-945745784-2j84f\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") " pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:33 crc kubenswrapper[4696]: I0318 15:39:33.224155 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" Mar 18 15:39:34 crc kubenswrapper[4696]: E0318 15:39:34.011072 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8tkdr" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" Mar 18 15:39:34 crc kubenswrapper[4696]: E0318 15:39:34.075636 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 15:39:34 crc kubenswrapper[4696]: E0318 15:39:34.076160 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4xdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-sjnmp_openshift-marketplace(b0569eea-b948-4633-91d7-4ebfa02d5a8b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:39:34 crc kubenswrapper[4696]: E0318 15:39:34.077394 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-sjnmp" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" Mar 18 15:39:35 crc 
kubenswrapper[4696]: I0318 15:39:35.001437 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.002665 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.012610 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6585cd7b87-prc4l"] Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.036296 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.038603 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7cfa7a5-467e-40ae-a375-060b1435acaf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7cfa7a5-467e-40ae-a375-060b1435acaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.038692 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7cfa7a5-467e-40ae-a375-060b1435acaf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7cfa7a5-467e-40ae-a375-060b1435acaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.133331 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945745784-2j84f"] Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.139691 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7cfa7a5-467e-40ae-a375-060b1435acaf-kubelet-dir\") pod \"revision-pruner-9-crc\" 
(UID: \"c7cfa7a5-467e-40ae-a375-060b1435acaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.139760 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7cfa7a5-467e-40ae-a375-060b1435acaf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7cfa7a5-467e-40ae-a375-060b1435acaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.140236 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7cfa7a5-467e-40ae-a375-060b1435acaf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7cfa7a5-467e-40ae-a375-060b1435acaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.163589 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7cfa7a5-467e-40ae-a375-060b1435acaf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7cfa7a5-467e-40ae-a375-060b1435acaf\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:35 crc kubenswrapper[4696]: I0318 15:39:35.329294 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:35 crc kubenswrapper[4696]: E0318 15:39:35.827744 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-sjnmp" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" Mar 18 15:39:36 crc kubenswrapper[4696]: E0318 15:39:36.065570 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 15:39:36 crc kubenswrapper[4696]: E0318 15:39:36.065822 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-87qhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nsq74_openshift-marketplace(c9ddf1c7-0e1d-4f54-a93d-1665148569b2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:39:36 crc kubenswrapper[4696]: E0318 15:39:36.067097 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nsq74" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" Mar 18 15:39:39 crc 
kubenswrapper[4696]: E0318 15:39:39.443716 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nsq74" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" Mar 18 15:39:39 crc kubenswrapper[4696]: I0318 15:39:39.791133 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:39:39 crc kubenswrapper[4696]: I0318 15:39:39.809356 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:39 crc kubenswrapper[4696]: I0318 15:39:39.818888 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:39:39 crc kubenswrapper[4696]: I0318 15:39:39.911446 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-var-lock\") pod \"installer-9-crc\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:39 crc kubenswrapper[4696]: I0318 15:39:39.911729 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ccac16-ae70-4ba3-9edf-76707d5643b1-kube-api-access\") pod \"installer-9-crc\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:39 crc kubenswrapper[4696]: I0318 15:39:39.911769 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:40 crc kubenswrapper[4696]: I0318 15:39:40.012971 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-var-lock\") pod \"installer-9-crc\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:40 crc kubenswrapper[4696]: I0318 15:39:40.013057 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ccac16-ae70-4ba3-9edf-76707d5643b1-kube-api-access\") pod \"installer-9-crc\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:40 crc kubenswrapper[4696]: I0318 15:39:40.013087 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:40 crc kubenswrapper[4696]: I0318 15:39:40.013149 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-var-lock\") pod \"installer-9-crc\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:40 crc kubenswrapper[4696]: I0318 15:39:40.013194 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:40 crc kubenswrapper[4696]: I0318 15:39:40.044205 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ccac16-ae70-4ba3-9edf-76707d5643b1-kube-api-access\") pod \"installer-9-crc\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:40 crc kubenswrapper[4696]: I0318 15:39:40.135494 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:40 crc kubenswrapper[4696]: E0318 15:39:40.973426 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 15:39:40 crc kubenswrapper[4696]: E0318 15:39:40.973984 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27rp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xd5v7_openshift-marketplace(aa06cecb-1f9f-431d-933f-0e87033cd695): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:39:40 crc kubenswrapper[4696]: E0318 15:39:40.975201 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xd5v7" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" Mar 18 15:39:41 crc 
kubenswrapper[4696]: E0318 15:39:41.063363 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xd5v7" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" Mar 18 15:39:41 crc kubenswrapper[4696]: I0318 15:39:41.126995 4696 scope.go:117] "RemoveContainer" containerID="e227c4f4f6b2d48ef02fb017a5b8d7aa6fabc1a456fbc7b5aafc87772bfe3b06" Mar 18 15:39:41 crc kubenswrapper[4696]: E0318 15:39:41.170947 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 15:39:41 crc kubenswrapper[4696]: E0318 15:39:41.171477 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-46xmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ldd6n_openshift-marketplace(459c4b74-f710-4b3a-b053-8b0326b87cb5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:39:41 crc kubenswrapper[4696]: E0318 15:39:41.172689 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ldd6n" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" Mar 18 15:39:41 crc 
kubenswrapper[4696]: E0318 15:39:41.202641 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 15:39:41 crc kubenswrapper[4696]: E0318 15:39:41.203156 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddhzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-q82hl_openshift-marketplace(cfaa1769-2588-440e-9757-ded58dcb0ac3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:39:41 crc kubenswrapper[4696]: E0318 15:39:41.204396 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-q82hl" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" Mar 18 15:39:41 crc kubenswrapper[4696]: I0318 15:39:41.558241 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 15:39:41 crc kubenswrapper[4696]: W0318 15:39:41.581646 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod816ad4e6_f60f_4487_a12b_529fc685d209.slice/crio-e374743d3a38163186f173367fbc3198ae85c26d279c061fe55b712f335ac75b WatchSource:0}: Error finding container e374743d3a38163186f173367fbc3198ae85c26d279c061fe55b712f335ac75b: Status 404 returned error can't find the container with id e374743d3a38163186f173367fbc3198ae85c26d279c061fe55b712f335ac75b Mar 18 15:39:41 crc kubenswrapper[4696]: W0318 15:39:41.663250 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-793d2c368f8ccc6061fa66fd6fb2ae5307f29c352bfe01e5e5e12aa878c1403f WatchSource:0}: Error finding container 793d2c368f8ccc6061fa66fd6fb2ae5307f29c352bfe01e5e5e12aa878c1403f: Status 404 returned error can't find the container with id 793d2c368f8ccc6061fa66fd6fb2ae5307f29c352bfe01e5e5e12aa878c1403f Mar 18 15:39:41 crc kubenswrapper[4696]: I0318 15:39:41.731961 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:39:41 crc kubenswrapper[4696]: W0318 15:39:41.792144 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-80bbaa07f10089cc5bbdea4816dff456589c155763135bf2aad68e4c95862626 WatchSource:0}: Error finding container 80bbaa07f10089cc5bbdea4816dff456589c155763135bf2aad68e4c95862626: Status 404 returned error can't find the container with id 80bbaa07f10089cc5bbdea4816dff456589c155763135bf2aad68e4c95862626 Mar 18 15:39:41 crc kubenswrapper[4696]: I0318 15:39:41.834018 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:39:41 crc kubenswrapper[4696]: I0318 15:39:41.850914 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945745784-2j84f"] Mar 18 15:39:41 crc kubenswrapper[4696]: W0318 15:39:41.866900 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc7cfa7a5_467e_40ae_a375_060b1435acaf.slice/crio-574056382a1f4510332330711bf9da1aa7ab3642060607c648610cb723c108aa WatchSource:0}: Error finding container 574056382a1f4510332330711bf9da1aa7ab3642060607c648610cb723c108aa: Status 404 returned error can't find the container with id 574056382a1f4510332330711bf9da1aa7ab3642060607c648610cb723c108aa Mar 18 15:39:41 crc kubenswrapper[4696]: I0318 15:39:41.873299 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6585cd7b87-prc4l"] Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.077654 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" event={"ID":"2c47ee76-0e0b-42f5-b824-be9c76bffd78","Type":"ContainerStarted","Data":"d27c0d00bc292aaa8052a94c5f99f624648527dcc730ba2325f0adae5c436310"} 
Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.081080 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p846" event={"ID":"80160993-15c5-4eea-ac72-2094fe935ac1","Type":"ContainerStarted","Data":"251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.083938 4696 generic.go:334] "Generic (PLEG): container finished" podID="0c1d314f-dde9-4056-964f-4eb911306306" containerID="f57af75a2d23f1374310594a4bb4d88cb0ff8d3b0801b0a531d4e216b0e18598" exitCode=0 Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.084016 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp998" event={"ID":"0c1d314f-dde9-4056-964f-4eb911306306","Type":"ContainerDied","Data":"f57af75a2d23f1374310594a4bb4d88cb0ff8d3b0801b0a531d4e216b0e18598"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.086045 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c7cfa7a5-467e-40ae-a375-060b1435acaf","Type":"ContainerStarted","Data":"574056382a1f4510332330711bf9da1aa7ab3642060607c648610cb723c108aa"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.127135 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"816ad4e6-f60f-4487-a12b-529fc685d209","Type":"ContainerStarted","Data":"1ffcdbea3b55d8710bc0618939d718fef0c48a1030b9a94568b0f38accdd5f3c"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.127178 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"816ad4e6-f60f-4487-a12b-529fc685d209","Type":"ContainerStarted","Data":"e374743d3a38163186f173367fbc3198ae85c26d279c061fe55b712f335ac75b"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.134225 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d5cf7d26306351047f764cdb985b82f74fd6fdd81fac87f9debb3d84ac273954"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.135595 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"793d2c368f8ccc6061fa66fd6fb2ae5307f29c352bfe01e5e5e12aa878c1403f"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.137321 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"70ccac16-ae70-4ba3-9edf-76707d5643b1","Type":"ContainerStarted","Data":"0cac7636bea9f2756f3b13c630ab1e37b88bba8154da44c5e19f76236b435440"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.138938 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" event={"ID":"854ec8b0-a321-4bcb-9327-96742fec3f31","Type":"ContainerStarted","Data":"5958a3f0397e5e03914c295d0124da20a9ac1aae5fb7b12a96328e1d63529ec2"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.143002 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"58948084150ef7c93f40189b3e13e9a81eb29b41430a13c2aaeb822f032b7dae"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.143064 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"80bbaa07f10089cc5bbdea4816dff456589c155763135bf2aad68e4c95862626"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.144117 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.171117 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=36.171093276 podStartE2EDuration="36.171093276s" podCreationTimestamp="2026-03-18 15:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:42.1658016 +0000 UTC m=+225.171975806" watchObservedRunningTime="2026-03-18 15:39:42.171093276 +0000 UTC m=+225.177267482" Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.186377 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.186535 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.187423 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"12f6eaa2186a5a65573821441fce55b15269975916912989786a79775570d032"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.187495 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c4668dcc1fb3ca255921d5599ea54a8e9b5feb416273b849c46be9deee87ac6f"} Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.193493 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" event={"ID":"9fc56b90-3630-479f-a9f0-87ed9c0de838","Type":"ContainerStarted","Data":"29b22fd6f7c00cb9596e935db1d32b25c9c4c5aa53f64365fceb4ba90c9ba959"} Mar 18 15:39:42 crc kubenswrapper[4696]: E0318 15:39:42.196762 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ldd6n" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" Mar 18 15:39:42 crc kubenswrapper[4696]: E0318 15:39:42.198679 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-q82hl" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.228448 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" podStartSLOduration=56.921421801 podStartE2EDuration="1m42.2284251s" podCreationTimestamp="2026-03-18 15:38:00 +0000 UTC" firstStartedPulling="2026-03-18 15:38:56.004672972 +0000 UTC m=+179.010847188" lastFinishedPulling="2026-03-18 15:39:41.311676281 +0000 UTC m=+224.317850487" observedRunningTime="2026-03-18 15:39:42.212666195 +0000 UTC m=+225.218840401" watchObservedRunningTime="2026-03-18 15:39:42.2284251 +0000 UTC m=+225.234599306" Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.382362 4696 csr.go:261] 
certificate signing request csr-nmpxs is approved, waiting to be issued
Mar 18 15:39:42 crc kubenswrapper[4696]: I0318 15:39:42.388607 4696 csr.go:257] certificate signing request csr-nmpxs is issued
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.208008 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"70ccac16-ae70-4ba3-9edf-76707d5643b1","Type":"ContainerStarted","Data":"9f7017c0d7d72c03a3ee62e097c778897f75097c225aacd88493e25e432454bc"}
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.210272 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" event={"ID":"2c47ee76-0e0b-42f5-b824-be9c76bffd78","Type":"ContainerStarted","Data":"989ae4d79cb72a25b75d2eaec96f5b62f0018896d60ff60ab31583a2cb933993"}
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.210458 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" podUID="2c47ee76-0e0b-42f5-b824-be9c76bffd78" containerName="controller-manager" containerID="cri-o://989ae4d79cb72a25b75d2eaec96f5b62f0018896d60ff60ab31583a2cb933993" gracePeriod=30
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.210617 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.222653 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" podUID="9fc56b90-3630-479f-a9f0-87ed9c0de838" containerName="route-controller-manager" containerID="cri-o://219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53" gracePeriod=30
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.222954 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.223133 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.223155 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" event={"ID":"9fc56b90-3630-479f-a9f0-87ed9c0de838","Type":"ContainerStarted","Data":"219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53"}
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.238499 4696 generic.go:334] "Generic (PLEG): container finished" podID="80160993-15c5-4eea-ac72-2094fe935ac1" containerID="251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce" exitCode=0
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.238630 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p846" event={"ID":"80160993-15c5-4eea-ac72-2094fe935ac1","Type":"ContainerDied","Data":"251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce"}
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.247390 4696 patch_prober.go:28] interesting pod/route-controller-manager-945745784-2j84f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:47388->10.217.0.58:8443: read: connection reset by peer" start-of-body=
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.247767 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" podUID="9fc56b90-3630-479f-a9f0-87ed9c0de838" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:47388->10.217.0.58:8443: read: connection reset by peer"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.248452 4696 patch_prober.go:28] interesting pod/route-controller-manager-945745784-2j84f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body=
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.248492 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" podUID="9fc56b90-3630-479f-a9f0-87ed9c0de838" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.260710 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" podStartSLOduration=28.26068246 podStartE2EDuration="28.26068246s" podCreationTimestamp="2026-03-18 15:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:43.255649321 +0000 UTC m=+226.261823567" watchObservedRunningTime="2026-03-18 15:39:43.26068246 +0000 UTC m=+226.266856676"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.262253 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.26223765 podStartE2EDuration="4.26223765s" podCreationTimestamp="2026-03-18 15:39:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:43.235463702 +0000 UTC m=+226.241637908" watchObservedRunningTime="2026-03-18 15:39:43.26223765 +0000 UTC m=+226.268411876"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.281930 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" podStartSLOduration=28.281902106 podStartE2EDuration="28.281902106s" podCreationTimestamp="2026-03-18 15:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:43.279186916 +0000 UTC m=+226.285361122" watchObservedRunningTime="2026-03-18 15:39:43.281902106 +0000 UTC m=+226.288076312"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.390114 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-02 17:13:07.241701578 +0000 UTC
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.390163 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6961h33m23.851542209s for next certificate rotation
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.667021 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.681205 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-client-ca\") pod \"9fc56b90-3630-479f-a9f0-87ed9c0de838\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") "
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.681272 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtjvl\" (UniqueName: \"kubernetes.io/projected/9fc56b90-3630-479f-a9f0-87ed9c0de838-kube-api-access-dtjvl\") pod \"9fc56b90-3630-479f-a9f0-87ed9c0de838\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") "
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.682427 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-client-ca" (OuterVolumeSpecName: "client-ca") pod "9fc56b90-3630-479f-a9f0-87ed9c0de838" (UID: "9fc56b90-3630-479f-a9f0-87ed9c0de838"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.683564 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-config\") pod \"9fc56b90-3630-479f-a9f0-87ed9c0de838\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") "
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.683712 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fc56b90-3630-479f-a9f0-87ed9c0de838-serving-cert\") pod \"9fc56b90-3630-479f-a9f0-87ed9c0de838\" (UID: \"9fc56b90-3630-479f-a9f0-87ed9c0de838\") "
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.684051 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-config" (OuterVolumeSpecName: "config") pod "9fc56b90-3630-479f-a9f0-87ed9c0de838" (UID: "9fc56b90-3630-479f-a9f0-87ed9c0de838"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.694206 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc56b90-3630-479f-a9f0-87ed9c0de838-kube-api-access-dtjvl" (OuterVolumeSpecName: "kube-api-access-dtjvl") pod "9fc56b90-3630-479f-a9f0-87ed9c0de838" (UID: "9fc56b90-3630-479f-a9f0-87ed9c0de838"). InnerVolumeSpecName "kube-api-access-dtjvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.694364 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc56b90-3630-479f-a9f0-87ed9c0de838-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9fc56b90-3630-479f-a9f0-87ed9c0de838" (UID: "9fc56b90-3630-479f-a9f0-87ed9c0de838"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.698585 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"]
Mar 18 15:39:43 crc kubenswrapper[4696]: E0318 15:39:43.698942 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc56b90-3630-479f-a9f0-87ed9c0de838" containerName="route-controller-manager"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.698963 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc56b90-3630-479f-a9f0-87ed9c0de838" containerName="route-controller-manager"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.699120 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc56b90-3630-479f-a9f0-87ed9c0de838" containerName="route-controller-manager"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.700874 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.709580 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"]
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.784787 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-config\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.784830 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee54946-64cb-4fe5-a707-3c4625ec200d-serving-cert\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.784868 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg895\" (UniqueName: \"kubernetes.io/projected/fee54946-64cb-4fe5-a707-3c4625ec200d-kube-api-access-dg895\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.784912 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-client-ca\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.784966 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.784977 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fc56b90-3630-479f-a9f0-87ed9c0de838-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.784985 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fc56b90-3630-479f-a9f0-87ed9c0de838-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.784996 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtjvl\" (UniqueName: \"kubernetes.io/projected/9fc56b90-3630-479f-a9f0-87ed9c0de838-kube-api-access-dtjvl\") on node \"crc\" DevicePath \"\""
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.886043 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg895\" (UniqueName: \"kubernetes.io/projected/fee54946-64cb-4fe5-a707-3c4625ec200d-kube-api-access-dg895\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.886121 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-client-ca\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.886169 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-config\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.886187 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee54946-64cb-4fe5-a707-3c4625ec200d-serving-cert\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.887568 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-client-ca\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.887889 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-config\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.891186 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee54946-64cb-4fe5-a707-3c4625ec200d-serving-cert\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:43 crc kubenswrapper[4696]: I0318 15:39:43.905649 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg895\" (UniqueName: \"kubernetes.io/projected/fee54946-64cb-4fe5-a707-3c4625ec200d-kube-api-access-dg895\") pod \"route-controller-manager-cb96cd574-cxrhs\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.040793 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.246258 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"]
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.252421 4696 generic.go:334] "Generic (PLEG): container finished" podID="816ad4e6-f60f-4487-a12b-529fc685d209" containerID="1ffcdbea3b55d8710bc0618939d718fef0c48a1030b9a94568b0f38accdd5f3c" exitCode=0
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.252589 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"816ad4e6-f60f-4487-a12b-529fc685d209","Type":"ContainerDied","Data":"1ffcdbea3b55d8710bc0618939d718fef0c48a1030b9a94568b0f38accdd5f3c"}
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.254500 4696 generic.go:334] "Generic (PLEG): container finished" podID="854ec8b0-a321-4bcb-9327-96742fec3f31" containerID="5958a3f0397e5e03914c295d0124da20a9ac1aae5fb7b12a96328e1d63529ec2" exitCode=0
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.254620 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" event={"ID":"854ec8b0-a321-4bcb-9327-96742fec3f31","Type":"ContainerDied","Data":"5958a3f0397e5e03914c295d0124da20a9ac1aae5fb7b12a96328e1d63529ec2"}
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.257389 4696 generic.go:334] "Generic (PLEG): container finished" podID="2c47ee76-0e0b-42f5-b824-be9c76bffd78" containerID="989ae4d79cb72a25b75d2eaec96f5b62f0018896d60ff60ab31583a2cb933993" exitCode=0
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.257623 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" event={"ID":"2c47ee76-0e0b-42f5-b824-be9c76bffd78","Type":"ContainerDied","Data":"989ae4d79cb72a25b75d2eaec96f5b62f0018896d60ff60ab31583a2cb933993"}
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.259187 4696 generic.go:334] "Generic (PLEG): container finished" podID="9fc56b90-3630-479f-a9f0-87ed9c0de838" containerID="219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53" exitCode=0
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.259281 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f"
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.259612 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" event={"ID":"9fc56b90-3630-479f-a9f0-87ed9c0de838","Type":"ContainerDied","Data":"219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53"}
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.259649 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-945745784-2j84f" event={"ID":"9fc56b90-3630-479f-a9f0-87ed9c0de838","Type":"ContainerDied","Data":"29b22fd6f7c00cb9596e935db1d32b25c9c4c5aa53f64365fceb4ba90c9ba959"}
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.259676 4696 scope.go:117] "RemoveContainer" containerID="219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53"
Mar 18 15:39:44 crc kubenswrapper[4696]: W0318 15:39:44.279154 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee54946_64cb_4fe5_a707_3c4625ec200d.slice/crio-8b14c4f1854f98bf1a8d29e3065d31ee2a83ca987622bc37010c6b4a0d55f618 WatchSource:0}: Error finding container 8b14c4f1854f98bf1a8d29e3065d31ee2a83ca987622bc37010c6b4a0d55f618: Status 404 returned error can't find the container with id 8b14c4f1854f98bf1a8d29e3065d31ee2a83ca987622bc37010c6b4a0d55f618
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.279639 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c7cfa7a5-467e-40ae-a375-060b1435acaf","Type":"ContainerStarted","Data":"43f1512f42abeb4f5634d9692366bfb28ceb92f6e83fc9a2e093a82403c51143"}
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.301896 4696 scope.go:117] "RemoveContainer" containerID="219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53"
Mar 18 15:39:44 crc kubenswrapper[4696]: E0318 15:39:44.304673 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53\": container with ID starting with 219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53 not found: ID does not exist" containerID="219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53"
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.304991 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53"} err="failed to get container status \"219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53\": rpc error: code = NotFound desc = could not find container \"219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53\": container with ID starting with 219c65ce620e7788c372036acf9d667613add09dad7f452f14915cb91f948a53 not found: ID does not exist"
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.343955 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.343933642 podStartE2EDuration="10.343933642s" podCreationTimestamp="2026-03-18 15:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:44.326836042 +0000 UTC m=+227.333010268" watchObservedRunningTime="2026-03-18 15:39:44.343933642 +0000 UTC m=+227.350107858"
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.344916 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945745784-2j84f"]
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.349567 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-945745784-2j84f"]
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.391003 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-31 16:02:02.582671956 +0000 UTC
Mar 18 15:39:44 crc kubenswrapper[4696]: I0318 15:39:44.391038 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6912h22m18.191636503s for next certificate rotation
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.287767 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp998" event={"ID":"0c1d314f-dde9-4056-964f-4eb911306306","Type":"ContainerStarted","Data":"1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f"}
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.291738 4696 generic.go:334] "Generic (PLEG): container finished" podID="c7cfa7a5-467e-40ae-a375-060b1435acaf" containerID="43f1512f42abeb4f5634d9692366bfb28ceb92f6e83fc9a2e093a82403c51143" exitCode=0
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.291905 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c7cfa7a5-467e-40ae-a375-060b1435acaf","Type":"ContainerDied","Data":"43f1512f42abeb4f5634d9692366bfb28ceb92f6e83fc9a2e093a82403c51143"}
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.296767 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" event={"ID":"fee54946-64cb-4fe5-a707-3c4625ec200d","Type":"ContainerStarted","Data":"8b14c4f1854f98bf1a8d29e3065d31ee2a83ca987622bc37010c6b4a0d55f618"}
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.592023 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564138-jzn9j"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.624663 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc56b90-3630-479f-a9f0-87ed9c0de838" path="/var/lib/kubelet/pods/9fc56b90-3630-479f-a9f0-87ed9c0de838/volumes"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.629547 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7ftd\" (UniqueName: \"kubernetes.io/projected/854ec8b0-a321-4bcb-9327-96742fec3f31-kube-api-access-p7ftd\") pod \"854ec8b0-a321-4bcb-9327-96742fec3f31\" (UID: \"854ec8b0-a321-4bcb-9327-96742fec3f31\") "
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.638715 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854ec8b0-a321-4bcb-9327-96742fec3f31-kube-api-access-p7ftd" (OuterVolumeSpecName: "kube-api-access-p7ftd") pod "854ec8b0-a321-4bcb-9327-96742fec3f31" (UID: "854ec8b0-a321-4bcb-9327-96742fec3f31"). InnerVolumeSpecName "kube-api-access-p7ftd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.701652 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.723670 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.730385 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/816ad4e6-f60f-4487-a12b-529fc685d209-kube-api-access\") pod \"816ad4e6-f60f-4487-a12b-529fc685d209\" (UID: \"816ad4e6-f60f-4487-a12b-529fc685d209\") "
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.730531 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/816ad4e6-f60f-4487-a12b-529fc685d209-kubelet-dir\") pod \"816ad4e6-f60f-4487-a12b-529fc685d209\" (UID: \"816ad4e6-f60f-4487-a12b-529fc685d209\") "
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.730634 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/816ad4e6-f60f-4487-a12b-529fc685d209-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "816ad4e6-f60f-4487-a12b-529fc685d209" (UID: "816ad4e6-f60f-4487-a12b-529fc685d209"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.730876 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7ftd\" (UniqueName: \"kubernetes.io/projected/854ec8b0-a321-4bcb-9327-96742fec3f31-kube-api-access-p7ftd\") on node \"crc\" DevicePath \"\""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.730893 4696 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/816ad4e6-f60f-4487-a12b-529fc685d209-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.739716 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/816ad4e6-f60f-4487-a12b-529fc685d209-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "816ad4e6-f60f-4487-a12b-529fc685d209" (UID: "816ad4e6-f60f-4487-a12b-529fc685d209"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.831906 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-767b455df8-rv4nr"]
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832044 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm5vq\" (UniqueName: \"kubernetes.io/projected/2c47ee76-0e0b-42f5-b824-be9c76bffd78-kube-api-access-vm5vq\") pod \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") "
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832142 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c47ee76-0e0b-42f5-b824-be9c76bffd78-serving-cert\") pod \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") "
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832172 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-client-ca\") pod \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") "
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832213 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-proxy-ca-bundles\") pod \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") "
Mar 18 15:39:45 crc kubenswrapper[4696]: E0318 15:39:45.832232 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854ec8b0-a321-4bcb-9327-96742fec3f31" containerName="oc"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832254 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="854ec8b0-a321-4bcb-9327-96742fec3f31" containerName="oc"
Mar 18 15:39:45 crc kubenswrapper[4696]: E0318 15:39:45.832273 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c47ee76-0e0b-42f5-b824-be9c76bffd78" containerName="controller-manager"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832281 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c47ee76-0e0b-42f5-b824-be9c76bffd78" containerName="controller-manager"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832295 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-config\") pod \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\" (UID: \"2c47ee76-0e0b-42f5-b824-be9c76bffd78\") "
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832513 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/816ad4e6-f60f-4487-a12b-529fc685d209-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 15:39:45 crc kubenswrapper[4696]: E0318 15:39:45.832300 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816ad4e6-f60f-4487-a12b-529fc685d209" containerName="pruner"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832588 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="816ad4e6-f60f-4487-a12b-529fc685d209" containerName="pruner"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832907 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="816ad4e6-f60f-4487-a12b-529fc685d209" containerName="pruner"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832926 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="854ec8b0-a321-4bcb-9327-96742fec3f31" containerName="oc"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.832940 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c47ee76-0e0b-42f5-b824-be9c76bffd78" containerName="controller-manager"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.833143 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c47ee76-0e0b-42f5-b824-be9c76bffd78" (UID: "2c47ee76-0e0b-42f5-b824-be9c76bffd78"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.833206 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2c47ee76-0e0b-42f5-b824-be9c76bffd78" (UID: "2c47ee76-0e0b-42f5-b824-be9c76bffd78"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.833419 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.835980 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c47ee76-0e0b-42f5-b824-be9c76bffd78-kube-api-access-vm5vq" (OuterVolumeSpecName: "kube-api-access-vm5vq") pod "2c47ee76-0e0b-42f5-b824-be9c76bffd78" (UID: "2c47ee76-0e0b-42f5-b824-be9c76bffd78"). InnerVolumeSpecName "kube-api-access-vm5vq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.838461 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-config" (OuterVolumeSpecName: "config") pod "2c47ee76-0e0b-42f5-b824-be9c76bffd78" (UID: "2c47ee76-0e0b-42f5-b824-be9c76bffd78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.839253 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c47ee76-0e0b-42f5-b824-be9c76bffd78-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c47ee76-0e0b-42f5-b824-be9c76bffd78" (UID: "2c47ee76-0e0b-42f5-b824-be9c76bffd78"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.846954 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-767b455df8-rv4nr"]
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934049 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-client-ca\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934113 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-config\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934153 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-proxy-ca-bundles\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934459 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcbkf\" (UniqueName: \"kubernetes.io/projected/ac189e2d-a0f3-4f8c-ac45-94200964fdba-kube-api-access-vcbkf\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr"
Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934563 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac189e2d-a0f3-4f8c-ac45-94200964fdba-serving-cert\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934714 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c47ee76-0e0b-42f5-b824-be9c76bffd78-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934737 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934750 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934767 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c47ee76-0e0b-42f5-b824-be9c76bffd78-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:45 crc kubenswrapper[4696]: I0318 15:39:45.934780 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm5vq\" (UniqueName: \"kubernetes.io/projected/2c47ee76-0e0b-42f5-b824-be9c76bffd78-kube-api-access-vm5vq\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.036254 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ac189e2d-a0f3-4f8c-ac45-94200964fdba-serving-cert\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.036362 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-client-ca\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.036408 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-config\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.036460 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-proxy-ca-bundles\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.036489 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcbkf\" (UniqueName: \"kubernetes.io/projected/ac189e2d-a0f3-4f8c-ac45-94200964fdba-kube-api-access-vcbkf\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.038853 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-proxy-ca-bundles\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.038850 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-client-ca\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.040121 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-config\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.041647 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac189e2d-a0f3-4f8c-ac45-94200964fdba-serving-cert\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.056572 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcbkf\" (UniqueName: \"kubernetes.io/projected/ac189e2d-a0f3-4f8c-ac45-94200964fdba-kube-api-access-vcbkf\") pod \"controller-manager-767b455df8-rv4nr\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 
15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.149632 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.315271 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"816ad4e6-f60f-4487-a12b-529fc685d209","Type":"ContainerDied","Data":"e374743d3a38163186f173367fbc3198ae85c26d279c061fe55b712f335ac75b"} Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.315939 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e374743d3a38163186f173367fbc3198ae85c26d279c061fe55b712f335ac75b" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.316032 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.321234 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" event={"ID":"854ec8b0-a321-4bcb-9327-96742fec3f31","Type":"ContainerDied","Data":"59b2ebb1d33687b3a8829d25cccde1f6c5a6ed3e1f29c40427879c8a33aaeae3"} Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.321264 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564138-jzn9j" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.321278 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59b2ebb1d33687b3a8829d25cccde1f6c5a6ed3e1f29c40427879c8a33aaeae3" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.323269 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" event={"ID":"fee54946-64cb-4fe5-a707-3c4625ec200d","Type":"ContainerStarted","Data":"44e77e8ffd9bed41a469cc83a5ab3c82871beb85f393bd92a7cd45553315aa87"} Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.326379 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.326420 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6585cd7b87-prc4l" event={"ID":"2c47ee76-0e0b-42f5-b824-be9c76bffd78","Type":"ContainerDied","Data":"d27c0d00bc292aaa8052a94c5f99f624648527dcc730ba2325f0adae5c436310"} Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.326462 4696 scope.go:117] "RemoveContainer" containerID="989ae4d79cb72a25b75d2eaec96f5b62f0018896d60ff60ab31583a2cb933993" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.349298 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-767b455df8-rv4nr"] Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.378352 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cp998" podStartSLOduration=5.684563769 podStartE2EDuration="44.378330129s" podCreationTimestamp="2026-03-18 15:39:02 +0000 UTC" firstStartedPulling="2026-03-18 15:39:04.787942343 +0000 UTC m=+187.794116549" lastFinishedPulling="2026-03-18 
15:39:43.481708703 +0000 UTC m=+226.487882909" observedRunningTime="2026-03-18 15:39:46.371980235 +0000 UTC m=+229.378154441" watchObservedRunningTime="2026-03-18 15:39:46.378330129 +0000 UTC m=+229.384504345" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.399633 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6585cd7b87-prc4l"] Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.401894 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6585cd7b87-prc4l"] Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.602442 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.647907 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7cfa7a5-467e-40ae-a375-060b1435acaf-kubelet-dir\") pod \"c7cfa7a5-467e-40ae-a375-060b1435acaf\" (UID: \"c7cfa7a5-467e-40ae-a375-060b1435acaf\") " Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.647993 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7cfa7a5-467e-40ae-a375-060b1435acaf-kube-api-access\") pod \"c7cfa7a5-467e-40ae-a375-060b1435acaf\" (UID: \"c7cfa7a5-467e-40ae-a375-060b1435acaf\") " Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.648045 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cfa7a5-467e-40ae-a375-060b1435acaf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c7cfa7a5-467e-40ae-a375-060b1435acaf" (UID: "c7cfa7a5-467e-40ae-a375-060b1435acaf"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.649258 4696 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7cfa7a5-467e-40ae-a375-060b1435acaf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.654742 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7cfa7a5-467e-40ae-a375-060b1435acaf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c7cfa7a5-467e-40ae-a375-060b1435acaf" (UID: "c7cfa7a5-467e-40ae-a375-060b1435acaf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:46 crc kubenswrapper[4696]: I0318 15:39:46.750479 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7cfa7a5-467e-40ae-a375-060b1435acaf-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.339229 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c7cfa7a5-467e-40ae-a375-060b1435acaf","Type":"ContainerDied","Data":"574056382a1f4510332330711bf9da1aa7ab3642060607c648610cb723c108aa"} Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.339292 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="574056382a1f4510332330711bf9da1aa7ab3642060607c648610cb723c108aa" Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.339379 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.347610 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" event={"ID":"ac189e2d-a0f3-4f8c-ac45-94200964fdba","Type":"ContainerStarted","Data":"1fe86e01c1cbce64b6ed4a939be3703dc5e91d5bdaa8c08bb577f56c20d0e6c9"} Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.347701 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.347715 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" event={"ID":"ac189e2d-a0f3-4f8c-ac45-94200964fdba","Type":"ContainerStarted","Data":"2d7429698fbbfbbd02f7842dda7b16f04d929f2e16f46bd4f33d1f416997667a"} Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.355598 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.373716 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" podStartSLOduration=12.373681029 podStartE2EDuration="12.373681029s" podCreationTimestamp="2026-03-18 15:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:47.372275743 +0000 UTC m=+230.378449949" watchObservedRunningTime="2026-03-18 15:39:47.373681029 +0000 UTC m=+230.379855235" Mar 18 15:39:47 crc kubenswrapper[4696]: I0318 15:39:47.605211 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c47ee76-0e0b-42f5-b824-be9c76bffd78" 
path="/var/lib/kubelet/pods/2c47ee76-0e0b-42f5-b824-be9c76bffd78/volumes" Mar 18 15:39:48 crc kubenswrapper[4696]: I0318 15:39:48.352272 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:48 crc kubenswrapper[4696]: I0318 15:39:48.357285 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:39:48 crc kubenswrapper[4696]: I0318 15:39:48.368503 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" podStartSLOduration=13.368482207 podStartE2EDuration="13.368482207s" podCreationTimestamp="2026-03-18 15:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:48.367879661 +0000 UTC m=+231.374053867" watchObservedRunningTime="2026-03-18 15:39:48.368482207 +0000 UTC m=+231.374656413" Mar 18 15:39:53 crc kubenswrapper[4696]: I0318 15:39:53.320905 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:53 crc kubenswrapper[4696]: I0318 15:39:53.321500 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:54 crc kubenswrapper[4696]: I0318 15:39:54.526591 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:54 crc kubenswrapper[4696]: I0318 15:39:54.682983 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:39:54 crc kubenswrapper[4696]: I0318 15:39:54.828882 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cp998"] Mar 18 15:39:55 crc kubenswrapper[4696]: I0318 15:39:55.003643 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-767b455df8-rv4nr"] Mar 18 15:39:55 crc kubenswrapper[4696]: I0318 15:39:55.004150 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" podUID="ac189e2d-a0f3-4f8c-ac45-94200964fdba" containerName="controller-manager" containerID="cri-o://1fe86e01c1cbce64b6ed4a939be3703dc5e91d5bdaa8c08bb577f56c20d0e6c9" gracePeriod=30 Mar 18 15:39:55 crc kubenswrapper[4696]: I0318 15:39:55.026851 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"] Mar 18 15:39:55 crc kubenswrapper[4696]: I0318 15:39:55.027046 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" podUID="fee54946-64cb-4fe5-a707-3c4625ec200d" containerName="route-controller-manager" containerID="cri-o://44e77e8ffd9bed41a469cc83a5ab3c82871beb85f393bd92a7cd45553315aa87" gracePeriod=30 Mar 18 15:39:55 crc kubenswrapper[4696]: I0318 15:39:55.402549 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p846" event={"ID":"80160993-15c5-4eea-ac72-2094fe935ac1","Type":"ContainerStarted","Data":"944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28"} Mar 18 15:39:55 crc kubenswrapper[4696]: I0318 15:39:55.404039 4696 generic.go:334] "Generic (PLEG): container finished" podID="fee54946-64cb-4fe5-a707-3c4625ec200d" containerID="44e77e8ffd9bed41a469cc83a5ab3c82871beb85f393bd92a7cd45553315aa87" exitCode=0 Mar 18 15:39:55 crc kubenswrapper[4696]: I0318 15:39:55.404369 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" event={"ID":"fee54946-64cb-4fe5-a707-3c4625ec200d","Type":"ContainerDied","Data":"44e77e8ffd9bed41a469cc83a5ab3c82871beb85f393bd92a7cd45553315aa87"} Mar 18 15:39:55 crc kubenswrapper[4696]: I0318 15:39:55.421061 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6p846" podStartSLOduration=12.898177609 podStartE2EDuration="52.421044635s" podCreationTimestamp="2026-03-18 15:39:03 +0000 UTC" firstStartedPulling="2026-03-18 15:39:14.089768964 +0000 UTC m=+197.095943170" lastFinishedPulling="2026-03-18 15:39:53.612636 +0000 UTC m=+236.618810196" observedRunningTime="2026-03-18 15:39:55.417916305 +0000 UTC m=+238.424090511" watchObservedRunningTime="2026-03-18 15:39:55.421044635 +0000 UTC m=+238.427218851" Mar 18 15:39:56 crc kubenswrapper[4696]: I0318 15:39:56.150734 4696 patch_prober.go:28] interesting pod/controller-manager-767b455df8-rv4nr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Mar 18 15:39:56 crc kubenswrapper[4696]: I0318 15:39:56.150810 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" podUID="ac189e2d-a0f3-4f8c-ac45-94200964fdba" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Mar 18 15:39:56 crc kubenswrapper[4696]: I0318 15:39:56.410659 4696 generic.go:334] "Generic (PLEG): container finished" podID="ac189e2d-a0f3-4f8c-ac45-94200964fdba" containerID="1fe86e01c1cbce64b6ed4a939be3703dc5e91d5bdaa8c08bb577f56c20d0e6c9" exitCode=0 Mar 18 15:39:56 crc kubenswrapper[4696]: I0318 15:39:56.410872 4696 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-cp998" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="registry-server" containerID="cri-o://1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f" gracePeriod=2 Mar 18 15:39:56 crc kubenswrapper[4696]: I0318 15:39:56.411117 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" event={"ID":"ac189e2d-a0f3-4f8c-ac45-94200964fdba","Type":"ContainerDied","Data":"1fe86e01c1cbce64b6ed4a939be3703dc5e91d5bdaa8c08bb577f56c20d0e6c9"} Mar 18 15:39:59 crc kubenswrapper[4696]: I0318 15:39:59.429793 4696 generic.go:334] "Generic (PLEG): container finished" podID="0c1d314f-dde9-4056-964f-4eb911306306" containerID="1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f" exitCode=0 Mar 18 15:39:59 crc kubenswrapper[4696]: I0318 15:39:59.429881 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp998" event={"ID":"0c1d314f-dde9-4056-964f-4eb911306306","Type":"ContainerDied","Data":"1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f"} Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.134356 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vkm8r"] Mar 18 15:40:00 crc kubenswrapper[4696]: E0318 15:40:00.134636 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cfa7a5-467e-40ae-a375-060b1435acaf" containerName="pruner" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.134652 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cfa7a5-467e-40ae-a375-060b1435acaf" containerName="pruner" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.134746 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cfa7a5-467e-40ae-a375-060b1435acaf" containerName="pruner" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.135160 4696 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-vkm8r" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.139683 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.140077 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.140537 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.140878 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxcw2\" (UniqueName: \"kubernetes.io/projected/a2528428-1e7c-49d7-8f64-d38ec08d18a7-kube-api-access-lxcw2\") pod \"auto-csr-approver-29564140-vkm8r\" (UID: \"a2528428-1e7c-49d7-8f64-d38ec08d18a7\") " pod="openshift-infra/auto-csr-approver-29564140-vkm8r" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.143494 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vkm8r"] Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.242088 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxcw2\" (UniqueName: \"kubernetes.io/projected/a2528428-1e7c-49d7-8f64-d38ec08d18a7-kube-api-access-lxcw2\") pod \"auto-csr-approver-29564140-vkm8r\" (UID: \"a2528428-1e7c-49d7-8f64-d38ec08d18a7\") " pod="openshift-infra/auto-csr-approver-29564140-vkm8r" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.261311 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxcw2\" (UniqueName: \"kubernetes.io/projected/a2528428-1e7c-49d7-8f64-d38ec08d18a7-kube-api-access-lxcw2\") pod \"auto-csr-approver-29564140-vkm8r\" (UID: 
\"a2528428-1e7c-49d7-8f64-d38ec08d18a7\") " pod="openshift-infra/auto-csr-approver-29564140-vkm8r" Mar 18 15:40:00 crc kubenswrapper[4696]: I0318 15:40:00.458933 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-vkm8r" Mar 18 15:40:03 crc kubenswrapper[4696]: E0318 15:40:03.322503 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f is running failed: container process not found" containerID="1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:40:03 crc kubenswrapper[4696]: E0318 15:40:03.323431 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f is running failed: container process not found" containerID="1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:40:03 crc kubenswrapper[4696]: E0318 15:40:03.323803 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f is running failed: container process not found" containerID="1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:40:03 crc kubenswrapper[4696]: E0318 15:40:03.323882 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/redhat-marketplace-cp998" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="registry-server" Mar 18 15:40:03 crc kubenswrapper[4696]: I0318 15:40:03.808033 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:40:03 crc kubenswrapper[4696]: I0318 15:40:03.808080 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:40:03 crc kubenswrapper[4696]: I0318 15:40:03.854159 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.084867 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.110272 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.120762 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z"] Mar 18 15:40:04 crc kubenswrapper[4696]: E0318 15:40:04.121032 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac189e2d-a0f3-4f8c-ac45-94200964fdba" containerName="controller-manager" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.121043 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac189e2d-a0f3-4f8c-ac45-94200964fdba" containerName="controller-manager" Mar 18 15:40:04 crc kubenswrapper[4696]: E0318 15:40:04.121055 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee54946-64cb-4fe5-a707-3c4625ec200d" containerName="route-controller-manager" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.121060 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee54946-64cb-4fe5-a707-3c4625ec200d" containerName="route-controller-manager" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.121158 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee54946-64cb-4fe5-a707-3c4625ec200d" containerName="route-controller-manager" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.121167 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac189e2d-a0f3-4f8c-ac45-94200964fdba" containerName="controller-manager" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.121550 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.130397 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z"] Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.193121 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-config\") pod \"fee54946-64cb-4fe5-a707-3c4625ec200d\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.193166 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-client-ca\") pod \"fee54946-64cb-4fe5-a707-3c4625ec200d\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.193306 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg895\" (UniqueName: \"kubernetes.io/projected/fee54946-64cb-4fe5-a707-3c4625ec200d-kube-api-access-dg895\") pod \"fee54946-64cb-4fe5-a707-3c4625ec200d\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.193335 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee54946-64cb-4fe5-a707-3c4625ec200d-serving-cert\") pod \"fee54946-64cb-4fe5-a707-3c4625ec200d\" (UID: \"fee54946-64cb-4fe5-a707-3c4625ec200d\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.194298 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-config" (OuterVolumeSpecName: "config") pod "fee54946-64cb-4fe5-a707-3c4625ec200d" (UID: 
"fee54946-64cb-4fe5-a707-3c4625ec200d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.194319 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-client-ca" (OuterVolumeSpecName: "client-ca") pod "fee54946-64cb-4fe5-a707-3c4625ec200d" (UID: "fee54946-64cb-4fe5-a707-3c4625ec200d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.198404 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee54946-64cb-4fe5-a707-3c4625ec200d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fee54946-64cb-4fe5-a707-3c4625ec200d" (UID: "fee54946-64cb-4fe5-a707-3c4625ec200d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.198489 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee54946-64cb-4fe5-a707-3c4625ec200d-kube-api-access-dg895" (OuterVolumeSpecName: "kube-api-access-dg895") pod "fee54946-64cb-4fe5-a707-3c4625ec200d" (UID: "fee54946-64cb-4fe5-a707-3c4625ec200d"). InnerVolumeSpecName "kube-api-access-dg895". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294539 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-client-ca\") pod \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294608 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac189e2d-a0f3-4f8c-ac45-94200964fdba-serving-cert\") pod \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294678 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-config\") pod \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294711 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcbkf\" (UniqueName: \"kubernetes.io/projected/ac189e2d-a0f3-4f8c-ac45-94200964fdba-kube-api-access-vcbkf\") pod \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294755 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-proxy-ca-bundles\") pod \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\" (UID: \"ac189e2d-a0f3-4f8c-ac45-94200964fdba\") " Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294913 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-config\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294944 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/311e6ae2-bf41-4276-b1ea-580b5d03da60-serving-cert\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294965 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg86w\" (UniqueName: \"kubernetes.io/projected/311e6ae2-bf41-4276-b1ea-580b5d03da60-kube-api-access-qg86w\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.294984 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-client-ca\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.295032 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg895\" (UniqueName: \"kubernetes.io/projected/fee54946-64cb-4fe5-a707-3c4625ec200d-kube-api-access-dg895\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.295045 4696 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee54946-64cb-4fe5-a707-3c4625ec200d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.295054 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.295062 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fee54946-64cb-4fe5-a707-3c4625ec200d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.295556 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-client-ca" (OuterVolumeSpecName: "client-ca") pod "ac189e2d-a0f3-4f8c-ac45-94200964fdba" (UID: "ac189e2d-a0f3-4f8c-ac45-94200964fdba"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.295647 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ac189e2d-a0f3-4f8c-ac45-94200964fdba" (UID: "ac189e2d-a0f3-4f8c-ac45-94200964fdba"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.296106 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-config" (OuterVolumeSpecName: "config") pod "ac189e2d-a0f3-4f8c-ac45-94200964fdba" (UID: "ac189e2d-a0f3-4f8c-ac45-94200964fdba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.298359 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac189e2d-a0f3-4f8c-ac45-94200964fdba-kube-api-access-vcbkf" (OuterVolumeSpecName: "kube-api-access-vcbkf") pod "ac189e2d-a0f3-4f8c-ac45-94200964fdba" (UID: "ac189e2d-a0f3-4f8c-ac45-94200964fdba"). InnerVolumeSpecName "kube-api-access-vcbkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.298416 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac189e2d-a0f3-4f8c-ac45-94200964fdba-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ac189e2d-a0f3-4f8c-ac45-94200964fdba" (UID: "ac189e2d-a0f3-4f8c-ac45-94200964fdba"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396335 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-config\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396393 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/311e6ae2-bf41-4276-b1ea-580b5d03da60-serving-cert\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396419 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg86w\" (UniqueName: 
\"kubernetes.io/projected/311e6ae2-bf41-4276-b1ea-580b5d03da60-kube-api-access-qg86w\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396444 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-client-ca\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396509 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcbkf\" (UniqueName: \"kubernetes.io/projected/ac189e2d-a0f3-4f8c-ac45-94200964fdba-kube-api-access-vcbkf\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396538 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396550 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396558 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac189e2d-a0f3-4f8c-ac45-94200964fdba-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.396570 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac189e2d-a0f3-4f8c-ac45-94200964fdba-config\") on 
node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.397419 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-client-ca\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.397710 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-config\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.399736 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/311e6ae2-bf41-4276-b1ea-580b5d03da60-serving-cert\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.413241 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg86w\" (UniqueName: \"kubernetes.io/projected/311e6ae2-bf41-4276-b1ea-580b5d03da60-kube-api-access-qg86w\") pod \"route-controller-manager-668f8c44bc-tkr7z\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.445532 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.455648 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" event={"ID":"ac189e2d-a0f3-4f8c-ac45-94200964fdba","Type":"ContainerDied","Data":"2d7429698fbbfbbd02f7842dda7b16f04d929f2e16f46bd4f33d1f416997667a"} Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.455694 4696 scope.go:117] "RemoveContainer" containerID="1fe86e01c1cbce64b6ed4a939be3703dc5e91d5bdaa8c08bb577f56c20d0e6c9" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.455780 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-767b455df8-rv4nr" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.464332 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" event={"ID":"fee54946-64cb-4fe5-a707-3c4625ec200d","Type":"ContainerDied","Data":"8b14c4f1854f98bf1a8d29e3065d31ee2a83ca987622bc37010c6b4a0d55f618"} Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.464384 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.487858 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-767b455df8-rv4nr"] Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.491775 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-767b455df8-rv4nr"] Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.507076 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"] Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.510962 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:40:04 crc kubenswrapper[4696]: I0318 15:40:04.511021 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs"] Mar 18 15:40:05 crc kubenswrapper[4696]: I0318 15:40:05.041888 4696 patch_prober.go:28] interesting pod/route-controller-manager-cb96cd574-cxrhs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:05 crc kubenswrapper[4696]: I0318 15:40:05.041950 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-cb96cd574-cxrhs" podUID="fee54946-64cb-4fe5-a707-3c4625ec200d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:40:05 crc 
kubenswrapper[4696]: I0318 15:40:05.606299 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac189e2d-a0f3-4f8c-ac45-94200964fdba" path="/var/lib/kubelet/pods/ac189e2d-a0f3-4f8c-ac45-94200964fdba/volumes" Mar 18 15:40:05 crc kubenswrapper[4696]: I0318 15:40:05.606931 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee54946-64cb-4fe5-a707-3c4625ec200d" path="/var/lib/kubelet/pods/fee54946-64cb-4fe5-a707-3c4625ec200d/volumes" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.021753 4696 scope.go:117] "RemoveContainer" containerID="44e77e8ffd9bed41a469cc83a5ab3c82871beb85f393bd92a7cd45553315aa87" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.060742 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.221100 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qzs2\" (UniqueName: \"kubernetes.io/projected/0c1d314f-dde9-4056-964f-4eb911306306-kube-api-access-4qzs2\") pod \"0c1d314f-dde9-4056-964f-4eb911306306\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.221160 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-utilities\") pod \"0c1d314f-dde9-4056-964f-4eb911306306\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.221245 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-catalog-content\") pod \"0c1d314f-dde9-4056-964f-4eb911306306\" (UID: \"0c1d314f-dde9-4056-964f-4eb911306306\") " Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.222362 4696 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-utilities" (OuterVolumeSpecName: "utilities") pod "0c1d314f-dde9-4056-964f-4eb911306306" (UID: "0c1d314f-dde9-4056-964f-4eb911306306"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.226637 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c1d314f-dde9-4056-964f-4eb911306306-kube-api-access-4qzs2" (OuterVolumeSpecName: "kube-api-access-4qzs2") pod "0c1d314f-dde9-4056-964f-4eb911306306" (UID: "0c1d314f-dde9-4056-964f-4eb911306306"). InnerVolumeSpecName "kube-api-access-4qzs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.259085 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c1d314f-dde9-4056-964f-4eb911306306" (UID: "0c1d314f-dde9-4056-964f-4eb911306306"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.296836 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vkm8r"] Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.322743 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qzs2\" (UniqueName: \"kubernetes.io/projected/0c1d314f-dde9-4056-964f-4eb911306306-kube-api-access-4qzs2\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.322776 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.322789 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d314f-dde9-4056-964f-4eb911306306-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:06 crc kubenswrapper[4696]: W0318 15:40:06.344340 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2528428_1e7c_49d7_8f64_d38ec08d18a7.slice/crio-7431466f28925201344069ecf1ab0b9f84ead0772d8aa8755ab965a2dbe7b7c2 WatchSource:0}: Error finding container 7431466f28925201344069ecf1ab0b9f84ead0772d8aa8755ab965a2dbe7b7c2: Status 404 returned error can't find the container with id 7431466f28925201344069ecf1ab0b9f84ead0772d8aa8755ab965a2dbe7b7c2 Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.487406 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-vkm8r" event={"ID":"a2528428-1e7c-49d7-8f64-d38ec08d18a7","Type":"ContainerStarted","Data":"7431466f28925201344069ecf1ab0b9f84ead0772d8aa8755ab965a2dbe7b7c2"} Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.490274 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cp998" event={"ID":"0c1d314f-dde9-4056-964f-4eb911306306","Type":"ContainerDied","Data":"04e956342d36a1e04a03cb5211dd771d9225d1be409e1825872f4536a0d7e07b"} Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.490297 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cp998" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.490333 4696 scope.go:117] "RemoveContainer" containerID="1071d90637ff828e097cb2c4d40609cca6865653f7379f35c25d5395c293a65f" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.523744 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z"] Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.523777 4696 scope.go:117] "RemoveContainer" containerID="f57af75a2d23f1374310594a4bb4d88cb0ff8d3b0801b0a531d4e216b0e18598" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.539577 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp998"] Mar 18 15:40:06 crc kubenswrapper[4696]: W0318 15:40:06.542185 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod311e6ae2_bf41_4276_b1ea_580b5d03da60.slice/crio-75c47cf206ada30f8a41b3566b16b0225e1839b29c518a97751b634ed37a4552 WatchSource:0}: Error finding container 75c47cf206ada30f8a41b3566b16b0225e1839b29c518a97751b634ed37a4552: Status 404 returned error can't find the container with id 75c47cf206ada30f8a41b3566b16b0225e1839b29c518a97751b634ed37a4552 Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.542344 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cp998"] Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.556898 4696 scope.go:117] "RemoveContainer" 
containerID="6907d3f478ae9a354d8ab82d2ef1828baf9eea72a4227fa132d734c848ca50ac" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.843473 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-685db9f869-5vvjn"] Mar 18 15:40:06 crc kubenswrapper[4696]: E0318 15:40:06.843995 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="extract-utilities" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.844006 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="extract-utilities" Mar 18 15:40:06 crc kubenswrapper[4696]: E0318 15:40:06.844021 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="registry-server" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.844027 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="registry-server" Mar 18 15:40:06 crc kubenswrapper[4696]: E0318 15:40:06.844039 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="extract-content" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.844046 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="extract-content" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.844221 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c1d314f-dde9-4056-964f-4eb911306306" containerName="registry-server" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.844758 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.846708 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.847792 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.848249 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.850413 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.850808 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.851507 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.856424 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.858436 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-685db9f869-5vvjn"] Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.930655 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-config\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " 
pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.930711 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-serving-cert\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.930738 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-client-ca\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.930809 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5rzg\" (UniqueName: \"kubernetes.io/projected/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-kube-api-access-t5rzg\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:06 crc kubenswrapper[4696]: I0318 15:40:06.930885 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-proxy-ca-bundles\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.037434 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-proxy-ca-bundles\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.037503 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-config\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.037559 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-serving-cert\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.037580 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-client-ca\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.037630 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5rzg\" (UniqueName: \"kubernetes.io/projected/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-kube-api-access-t5rzg\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 
15:40:07.040249 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-config\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.043097 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-client-ca\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.043384 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-proxy-ca-bundles\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.044358 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-serving-cert\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.057477 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5rzg\" (UniqueName: \"kubernetes.io/projected/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-kube-api-access-t5rzg\") pod \"controller-manager-685db9f869-5vvjn\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " 
pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.271829 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.475036 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-685db9f869-5vvjn"] Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.516194 4696 generic.go:334] "Generic (PLEG): container finished" podID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerID="00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b" exitCode=0 Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.516276 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xd5v7" event={"ID":"aa06cecb-1f9f-431d-933f-0e87033cd695","Type":"ContainerDied","Data":"00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.519899 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q82hl" event={"ID":"cfaa1769-2588-440e-9757-ded58dcb0ac3","Type":"ContainerStarted","Data":"320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.525103 4696 generic.go:334] "Generic (PLEG): container finished" podID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerID="97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3" exitCode=0 Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.525165 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjnmp" event={"ID":"b0569eea-b948-4633-91d7-4ebfa02d5a8b","Type":"ContainerDied","Data":"97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 
15:40:07.527733 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" event={"ID":"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22","Type":"ContainerStarted","Data":"475af7355a402c231ba353dd04a17eab10b35f20737c94cb31b7ff7784eb2b2d"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.555864 4696 generic.go:334] "Generic (PLEG): container finished" podID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerID="60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e" exitCode=0 Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.555980 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldd6n" event={"ID":"459c4b74-f710-4b3a-b053-8b0326b87cb5","Type":"ContainerDied","Data":"60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.570278 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerID="b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1" exitCode=0 Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.570395 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tkdr" event={"ID":"4e22cc1d-032f-4f3a-a0ca-51708beef610","Type":"ContainerDied","Data":"b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.588937 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq74" event={"ID":"c9ddf1c7-0e1d-4f54-a93d-1665148569b2","Type":"ContainerDied","Data":"c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.588904 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" 
containerID="c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74" exitCode=0 Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.593731 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" event={"ID":"311e6ae2-bf41-4276-b1ea-580b5d03da60","Type":"ContainerStarted","Data":"ad41cbde589f68d4286518edbae1c1b1e4a7bee65e99523d52b1d0db37979dda"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.593769 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" event={"ID":"311e6ae2-bf41-4276-b1ea-580b5d03da60","Type":"ContainerStarted","Data":"75c47cf206ada30f8a41b3566b16b0225e1839b29c518a97751b634ed37a4552"} Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.594084 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.605885 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c1d314f-dde9-4056-964f-4eb911306306" path="/var/lib/kubelet/pods/0c1d314f-dde9-4056-964f-4eb911306306/volumes" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.654858 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:07 crc kubenswrapper[4696]: I0318 15:40:07.693060 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" podStartSLOduration=12.693028261 podStartE2EDuration="12.693028261s" podCreationTimestamp="2026-03-18 15:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:07.685888888 +0000 UTC 
m=+250.692063104" watchObservedRunningTime="2026-03-18 15:40:07.693028261 +0000 UTC m=+250.699202467" Mar 18 15:40:08 crc kubenswrapper[4696]: I0318 15:40:08.600332 4696 generic.go:334] "Generic (PLEG): container finished" podID="a2528428-1e7c-49d7-8f64-d38ec08d18a7" containerID="bf4ca4aca02cf88e131f2eb9d2bfceeaf570ffdadc28bf974e59c3bd0e189140" exitCode=0 Mar 18 15:40:08 crc kubenswrapper[4696]: I0318 15:40:08.600445 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-vkm8r" event={"ID":"a2528428-1e7c-49d7-8f64-d38ec08d18a7","Type":"ContainerDied","Data":"bf4ca4aca02cf88e131f2eb9d2bfceeaf570ffdadc28bf974e59c3bd0e189140"} Mar 18 15:40:08 crc kubenswrapper[4696]: I0318 15:40:08.606206 4696 generic.go:334] "Generic (PLEG): container finished" podID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerID="320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e" exitCode=0 Mar 18 15:40:08 crc kubenswrapper[4696]: I0318 15:40:08.606337 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q82hl" event={"ID":"cfaa1769-2588-440e-9757-ded58dcb0ac3","Type":"ContainerDied","Data":"320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e"} Mar 18 15:40:08 crc kubenswrapper[4696]: I0318 15:40:08.611635 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" event={"ID":"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22","Type":"ContainerStarted","Data":"60e7676cdc9ecf9e217e42e3899611078dc7c86f82f3c39474ee5549e39eb7ab"} Mar 18 15:40:08 crc kubenswrapper[4696]: I0318 15:40:08.611680 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:08 crc kubenswrapper[4696]: I0318 15:40:08.624752 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:08 crc kubenswrapper[4696]: I0318 15:40:08.644426 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" podStartSLOduration=13.644405512 podStartE2EDuration="13.644405512s" podCreationTimestamp="2026-03-18 15:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:08.643324184 +0000 UTC m=+251.649498400" watchObservedRunningTime="2026-03-18 15:40:08.644405512 +0000 UTC m=+251.650579718" Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.617452 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tkdr" event={"ID":"4e22cc1d-032f-4f3a-a0ca-51708beef610","Type":"ContainerStarted","Data":"7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79"} Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.620053 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq74" event={"ID":"c9ddf1c7-0e1d-4f54-a93d-1665148569b2","Type":"ContainerStarted","Data":"4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2"} Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.622982 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldd6n" event={"ID":"459c4b74-f710-4b3a-b053-8b0326b87cb5","Type":"ContainerStarted","Data":"72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235"} Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.625540 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q82hl" event={"ID":"cfaa1769-2588-440e-9757-ded58dcb0ac3","Type":"ContainerStarted","Data":"87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78"} Mar 18 15:40:09 crc kubenswrapper[4696]: 
I0318 15:40:09.628976 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjnmp" event={"ID":"b0569eea-b948-4633-91d7-4ebfa02d5a8b","Type":"ContainerStarted","Data":"76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae"} Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.631154 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xd5v7" event={"ID":"aa06cecb-1f9f-431d-933f-0e87033cd695","Type":"ContainerStarted","Data":"d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b"} Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.646696 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8tkdr" podStartSLOduration=2.431708209 podStartE2EDuration="1m9.646670291s" podCreationTimestamp="2026-03-18 15:39:00 +0000 UTC" firstStartedPulling="2026-03-18 15:39:01.683295647 +0000 UTC m=+184.689469853" lastFinishedPulling="2026-03-18 15:40:08.898257729 +0000 UTC m=+251.904431935" observedRunningTime="2026-03-18 15:40:09.642161265 +0000 UTC m=+252.648335471" watchObservedRunningTime="2026-03-18 15:40:09.646670291 +0000 UTC m=+252.652844497" Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.666459 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ldd6n" podStartSLOduration=3.699578845 podStartE2EDuration="1m9.66644096s" podCreationTimestamp="2026-03-18 15:39:00 +0000 UTC" firstStartedPulling="2026-03-18 15:39:02.701118163 +0000 UTC m=+185.707292369" lastFinishedPulling="2026-03-18 15:40:08.667980278 +0000 UTC m=+251.674154484" observedRunningTime="2026-03-18 15:40:09.666299776 +0000 UTC m=+252.672473982" watchObservedRunningTime="2026-03-18 15:40:09.66644096 +0000 UTC m=+252.672615166" Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.696319 4696 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/community-operators-sjnmp" podStartSLOduration=3.886334582 podStartE2EDuration="1m9.696301547s" podCreationTimestamp="2026-03-18 15:39:00 +0000 UTC" firstStartedPulling="2026-03-18 15:39:02.708434577 +0000 UTC m=+185.714608783" lastFinishedPulling="2026-03-18 15:40:08.518401542 +0000 UTC m=+251.524575748" observedRunningTime="2026-03-18 15:40:09.692792717 +0000 UTC m=+252.698966933" watchObservedRunningTime="2026-03-18 15:40:09.696301547 +0000 UTC m=+252.702475753" Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.719730 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xd5v7" podStartSLOduration=3.787295069 podStartE2EDuration="1m9.719702849s" podCreationTimestamp="2026-03-18 15:39:00 +0000 UTC" firstStartedPulling="2026-03-18 15:39:02.720586393 +0000 UTC m=+185.726760599" lastFinishedPulling="2026-03-18 15:40:08.652994173 +0000 UTC m=+251.659168379" observedRunningTime="2026-03-18 15:40:09.717570964 +0000 UTC m=+252.723745170" watchObservedRunningTime="2026-03-18 15:40:09.719702849 +0000 UTC m=+252.725877055" Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.740701 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nsq74" podStartSLOduration=2.854285282 podStartE2EDuration="1m7.740677778s" podCreationTimestamp="2026-03-18 15:39:02 +0000 UTC" firstStartedPulling="2026-03-18 15:39:03.751790886 +0000 UTC m=+186.757965092" lastFinishedPulling="2026-03-18 15:40:08.638183382 +0000 UTC m=+251.644357588" observedRunningTime="2026-03-18 15:40:09.740420802 +0000 UTC m=+252.746595008" watchObservedRunningTime="2026-03-18 15:40:09.740677778 +0000 UTC m=+252.746851984" Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.969229 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-vkm8r" Mar 18 15:40:09 crc kubenswrapper[4696]: I0318 15:40:09.988652 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q82hl" podStartSLOduration=3.746221817 podStartE2EDuration="1m6.988635394s" podCreationTimestamp="2026-03-18 15:39:03 +0000 UTC" firstStartedPulling="2026-03-18 15:39:05.792492545 +0000 UTC m=+188.798666751" lastFinishedPulling="2026-03-18 15:40:09.034906122 +0000 UTC m=+252.041080328" observedRunningTime="2026-03-18 15:40:09.781021736 +0000 UTC m=+252.787195942" watchObservedRunningTime="2026-03-18 15:40:09.988635394 +0000 UTC m=+252.994809600" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.088063 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxcw2\" (UniqueName: \"kubernetes.io/projected/a2528428-1e7c-49d7-8f64-d38ec08d18a7-kube-api-access-lxcw2\") pod \"a2528428-1e7c-49d7-8f64-d38ec08d18a7\" (UID: \"a2528428-1e7c-49d7-8f64-d38ec08d18a7\") " Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.093698 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2528428-1e7c-49d7-8f64-d38ec08d18a7-kube-api-access-lxcw2" (OuterVolumeSpecName: "kube-api-access-lxcw2") pod "a2528428-1e7c-49d7-8f64-d38ec08d18a7" (UID: "a2528428-1e7c-49d7-8f64-d38ec08d18a7"). InnerVolumeSpecName "kube-api-access-lxcw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.189733 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxcw2\" (UniqueName: \"kubernetes.io/projected/a2528428-1e7c-49d7-8f64-d38ec08d18a7-kube-api-access-lxcw2\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.585762 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.585809 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.637078 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-vkm8r" event={"ID":"a2528428-1e7c-49d7-8f64-d38ec08d18a7","Type":"ContainerDied","Data":"7431466f28925201344069ecf1ab0b9f84ead0772d8aa8755ab965a2dbe7b7c2"} Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.637150 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7431466f28925201344069ecf1ab0b9f84ead0772d8aa8755ab965a2dbe7b7c2" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.637236 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-vkm8r" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.756507 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.756581 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.990671 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:40:10 crc kubenswrapper[4696]: I0318 15:40:10.990758 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:40:11 crc kubenswrapper[4696]: I0318 15:40:11.044151 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:40:11 crc kubenswrapper[4696]: I0318 15:40:11.258479 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:40:11 crc kubenswrapper[4696]: I0318 15:40:11.258830 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:40:11 crc kubenswrapper[4696]: I0318 15:40:11.302183 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:40:11 crc kubenswrapper[4696]: I0318 15:40:11.653811 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-xd5v7" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="registry-server" probeResult="failure" output=< Mar 18 15:40:11 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 15:40:11 crc 
kubenswrapper[4696]: > Mar 18 15:40:11 crc kubenswrapper[4696]: I0318 15:40:11.798818 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8tkdr" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="registry-server" probeResult="failure" output=< Mar 18 15:40:11 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 15:40:11 crc kubenswrapper[4696]: > Mar 18 15:40:12 crc kubenswrapper[4696]: I0318 15:40:12.184382 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:40:12 crc kubenswrapper[4696]: I0318 15:40:12.184450 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:40:12 crc kubenswrapper[4696]: I0318 15:40:12.745117 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:40:12 crc kubenswrapper[4696]: I0318 15:40:12.745174 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:40:12 crc kubenswrapper[4696]: I0318 15:40:12.788720 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:40:13 crc kubenswrapper[4696]: I0318 15:40:13.699503 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:40:14 crc kubenswrapper[4696]: I0318 
15:40:14.232469 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:40:14 crc kubenswrapper[4696]: I0318 15:40:14.232546 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:40:14 crc kubenswrapper[4696]: I0318 15:40:14.995307 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-685db9f869-5vvjn"] Mar 18 15:40:14 crc kubenswrapper[4696]: I0318 15:40:14.995823 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" podUID="0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" containerName="controller-manager" containerID="cri-o://60e7676cdc9ecf9e217e42e3899611078dc7c86f82f3c39474ee5549e39eb7ab" gracePeriod=30 Mar 18 15:40:15 crc kubenswrapper[4696]: I0318 15:40:15.101231 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z"] Mar 18 15:40:15 crc kubenswrapper[4696]: I0318 15:40:15.101443 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" podUID="311e6ae2-bf41-4276-b1ea-580b5d03da60" containerName="route-controller-manager" containerID="cri-o://ad41cbde589f68d4286518edbae1c1b1e4a7bee65e99523d52b1d0db37979dda" gracePeriod=30 Mar 18 15:40:15 crc kubenswrapper[4696]: I0318 15:40:15.269155 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q82hl" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="registry-server" probeResult="failure" output=< Mar 18 15:40:15 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 15:40:15 crc kubenswrapper[4696]: > Mar 18 15:40:15 crc kubenswrapper[4696]: I0318 15:40:15.292676 
4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:15 crc kubenswrapper[4696]: I0318 15:40:15.674530 4696 generic.go:334] "Generic (PLEG): container finished" podID="311e6ae2-bf41-4276-b1ea-580b5d03da60" containerID="ad41cbde589f68d4286518edbae1c1b1e4a7bee65e99523d52b1d0db37979dda" exitCode=0 Mar 18 15:40:15 crc kubenswrapper[4696]: I0318 15:40:15.674639 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" event={"ID":"311e6ae2-bf41-4276-b1ea-580b5d03da60","Type":"ContainerDied","Data":"ad41cbde589f68d4286518edbae1c1b1e4a7bee65e99523d52b1d0db37979dda"} Mar 18 15:40:15 crc kubenswrapper[4696]: I0318 15:40:15.676047 4696 generic.go:334] "Generic (PLEG): container finished" podID="0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" containerID="60e7676cdc9ecf9e217e42e3899611078dc7c86f82f3c39474ee5549e39eb7ab" exitCode=0 Mar 18 15:40:15 crc kubenswrapper[4696]: I0318 15:40:15.676085 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" event={"ID":"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22","Type":"ContainerDied","Data":"60e7676cdc9ecf9e217e42e3899611078dc7c86f82f3c39474ee5549e39eb7ab"} Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.304636 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.341859 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58dd9d8f5-nqncp"] Mar 18 15:40:16 crc kubenswrapper[4696]: E0318 15:40:16.342153 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2528428-1e7c-49d7-8f64-d38ec08d18a7" containerName="oc" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.342175 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2528428-1e7c-49d7-8f64-d38ec08d18a7" containerName="oc" Mar 18 15:40:16 crc kubenswrapper[4696]: E0318 15:40:16.342193 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" containerName="controller-manager" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.342202 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" containerName="controller-manager" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.342364 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" containerName="controller-manager" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.342380 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2528428-1e7c-49d7-8f64-d38ec08d18a7" containerName="oc" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.342793 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.346305 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58dd9d8f5-nqncp"] Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376243 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-serving-cert\") pod \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376298 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-client-ca\") pod \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376339 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5rzg\" (UniqueName: \"kubernetes.io/projected/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-kube-api-access-t5rzg\") pod \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376365 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-config\") pod \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\" (UID: \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376382 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-proxy-ca-bundles\") pod \"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\" (UID: 
\"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376470 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-proxy-ca-bundles\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376503 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-config\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376547 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c80d683c-b706-444b-8c18-0c16da50d688-serving-cert\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376595 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-client-ca\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.376611 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsg8k\" (UniqueName: 
\"kubernetes.io/projected/c80d683c-b706-444b-8c18-0c16da50d688-kube-api-access-zsg8k\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.377430 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" (UID: "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.378239 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" (UID: "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.378304 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-config" (OuterVolumeSpecName: "config") pod "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" (UID: "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.382661 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-kube-api-access-t5rzg" (OuterVolumeSpecName: "kube-api-access-t5rzg") pod "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" (UID: "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22"). InnerVolumeSpecName "kube-api-access-t5rzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.382898 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" (UID: "0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.414069 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477138 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/311e6ae2-bf41-4276-b1ea-580b5d03da60-serving-cert\") pod \"311e6ae2-bf41-4276-b1ea-580b5d03da60\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477279 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg86w\" (UniqueName: \"kubernetes.io/projected/311e6ae2-bf41-4276-b1ea-580b5d03da60-kube-api-access-qg86w\") pod \"311e6ae2-bf41-4276-b1ea-580b5d03da60\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477341 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-client-ca\") pod \"311e6ae2-bf41-4276-b1ea-580b5d03da60\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477395 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-config\") pod \"311e6ae2-bf41-4276-b1ea-580b5d03da60\" (UID: \"311e6ae2-bf41-4276-b1ea-580b5d03da60\") " Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477634 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-proxy-ca-bundles\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477679 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-config\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477719 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c80d683c-b706-444b-8c18-0c16da50d688-serving-cert\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477754 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-client-ca\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477781 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsg8k\" 
(UniqueName: \"kubernetes.io/projected/c80d683c-b706-444b-8c18-0c16da50d688-kube-api-access-zsg8k\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477838 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477852 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477865 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5rzg\" (UniqueName: \"kubernetes.io/projected/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-kube-api-access-t5rzg\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477878 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.477892 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.479743 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-client-ca" (OuterVolumeSpecName: "client-ca") pod "311e6ae2-bf41-4276-b1ea-580b5d03da60" (UID: "311e6ae2-bf41-4276-b1ea-580b5d03da60"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.479869 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-config" (OuterVolumeSpecName: "config") pod "311e6ae2-bf41-4276-b1ea-580b5d03da60" (UID: "311e6ae2-bf41-4276-b1ea-580b5d03da60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.480237 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-proxy-ca-bundles\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.480472 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-client-ca\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.481350 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80d683c-b706-444b-8c18-0c16da50d688-config\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.481562 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311e6ae2-bf41-4276-b1ea-580b5d03da60-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "311e6ae2-bf41-4276-b1ea-580b5d03da60" (UID: 
"311e6ae2-bf41-4276-b1ea-580b5d03da60"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.482208 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311e6ae2-bf41-4276-b1ea-580b5d03da60-kube-api-access-qg86w" (OuterVolumeSpecName: "kube-api-access-qg86w") pod "311e6ae2-bf41-4276-b1ea-580b5d03da60" (UID: "311e6ae2-bf41-4276-b1ea-580b5d03da60"). InnerVolumeSpecName "kube-api-access-qg86w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.482860 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c80d683c-b706-444b-8c18-0c16da50d688-serving-cert\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.494494 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsg8k\" (UniqueName: \"kubernetes.io/projected/c80d683c-b706-444b-8c18-0c16da50d688-kube-api-access-zsg8k\") pod \"controller-manager-58dd9d8f5-nqncp\" (UID: \"c80d683c-b706-444b-8c18-0c16da50d688\") " pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.579315 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/311e6ae2-bf41-4276-b1ea-580b5d03da60-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.579359 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg86w\" (UniqueName: \"kubernetes.io/projected/311e6ae2-bf41-4276-b1ea-580b5d03da60-kube-api-access-qg86w\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc 
kubenswrapper[4696]: I0318 15:40:16.579369 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.579378 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/311e6ae2-bf41-4276-b1ea-580b5d03da60-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.662767 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.682482 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.682497 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z" event={"ID":"311e6ae2-bf41-4276-b1ea-580b5d03da60","Type":"ContainerDied","Data":"75c47cf206ada30f8a41b3566b16b0225e1839b29c518a97751b634ed37a4552"} Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.682577 4696 scope.go:117] "RemoveContainer" containerID="ad41cbde589f68d4286518edbae1c1b1e4a7bee65e99523d52b1d0db37979dda" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.684746 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" event={"ID":"0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22","Type":"ContainerDied","Data":"475af7355a402c231ba353dd04a17eab10b35f20737c94cb31b7ff7784eb2b2d"} Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.684773 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-685db9f869-5vvjn" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.709324 4696 scope.go:117] "RemoveContainer" containerID="60e7676cdc9ecf9e217e42e3899611078dc7c86f82f3c39474ee5549e39eb7ab" Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.712409 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z"] Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.723815 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f8c44bc-tkr7z"] Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.728809 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-685db9f869-5vvjn"] Mar 18 15:40:16 crc kubenswrapper[4696]: I0318 15:40:16.731812 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-685db9f869-5vvjn"] Mar 18 15:40:17 crc kubenswrapper[4696]: I0318 15:40:17.094090 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58dd9d8f5-nqncp"] Mar 18 15:40:17 crc kubenswrapper[4696]: W0318 15:40:17.097144 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc80d683c_b706_444b_8c18_0c16da50d688.slice/crio-c1d37c6e936945cde42177a8854edd5820de41670ef71d63bf815f740e44741a WatchSource:0}: Error finding container c1d37c6e936945cde42177a8854edd5820de41670ef71d63bf815f740e44741a: Status 404 returned error can't find the container with id c1d37c6e936945cde42177a8854edd5820de41670ef71d63bf815f740e44741a Mar 18 15:40:17 crc kubenswrapper[4696]: I0318 15:40:17.606894 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22" 
path="/var/lib/kubelet/pods/0d8f28cc-9b2b-42c7-afd8-ad3c32b17b22/volumes" Mar 18 15:40:17 crc kubenswrapper[4696]: I0318 15:40:17.607821 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311e6ae2-bf41-4276-b1ea-580b5d03da60" path="/var/lib/kubelet/pods/311e6ae2-bf41-4276-b1ea-580b5d03da60/volumes" Mar 18 15:40:17 crc kubenswrapper[4696]: I0318 15:40:17.697510 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" event={"ID":"c80d683c-b706-444b-8c18-0c16da50d688","Type":"ContainerStarted","Data":"c1d37c6e936945cde42177a8854edd5820de41670ef71d63bf815f740e44741a"} Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.703774 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" event={"ID":"c80d683c-b706-444b-8c18-0c16da50d688","Type":"ContainerStarted","Data":"f490f841864d5fb9766852c7ef926a9c4750a8ad8bc3691ee712e17e582bcdb6"} Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.704069 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.712034 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.725865 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58dd9d8f5-nqncp" podStartSLOduration=3.725850267 podStartE2EDuration="3.725850267s" podCreationTimestamp="2026-03-18 15:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:18.725022725 +0000 UTC m=+261.731196941" watchObservedRunningTime="2026-03-18 15:40:18.725850267 +0000 UTC 
m=+261.732024473" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.849464 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4"] Mar 18 15:40:18 crc kubenswrapper[4696]: E0318 15:40:18.849723 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311e6ae2-bf41-4276-b1ea-580b5d03da60" containerName="route-controller-manager" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.849738 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="311e6ae2-bf41-4276-b1ea-580b5d03da60" containerName="route-controller-manager" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.849842 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="311e6ae2-bf41-4276-b1ea-580b5d03da60" containerName="route-controller-manager" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.850179 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.852499 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.855491 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.855784 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.855869 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.860029 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.860436 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.865757 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4"] Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.906760 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b8e906-da43-4ac7-8700-4689c9852803-serving-cert\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.906815 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83b8e906-da43-4ac7-8700-4689c9852803-client-ca\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.906876 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b8e906-da43-4ac7-8700-4689c9852803-config\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:18 crc kubenswrapper[4696]: I0318 15:40:18.906924 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nm66\" 
(UniqueName: \"kubernetes.io/projected/83b8e906-da43-4ac7-8700-4689c9852803-kube-api-access-9nm66\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.008190 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b8e906-da43-4ac7-8700-4689c9852803-serving-cert\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.008290 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83b8e906-da43-4ac7-8700-4689c9852803-client-ca\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.008326 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b8e906-da43-4ac7-8700-4689c9852803-config\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.008365 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nm66\" (UniqueName: \"kubernetes.io/projected/83b8e906-da43-4ac7-8700-4689c9852803-kube-api-access-9nm66\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.010027 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83b8e906-da43-4ac7-8700-4689c9852803-client-ca\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.010211 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b8e906-da43-4ac7-8700-4689c9852803-config\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.013815 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b8e906-da43-4ac7-8700-4689c9852803-serving-cert\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.023971 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nm66\" (UniqueName: \"kubernetes.io/projected/83b8e906-da43-4ac7-8700-4689c9852803-kube-api-access-9nm66\") pod \"route-controller-manager-5c7b7b5585-55dj4\" (UID: \"83b8e906-da43-4ac7-8700-4689c9852803\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.165165 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.556541 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4"] Mar 18 15:40:19 crc kubenswrapper[4696]: W0318 15:40:19.563682 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83b8e906_da43_4ac7_8700_4689c9852803.slice/crio-0baa540de62a73e49920d8eec6ab65142ddb0d0ced25777fd271ec4cec5f5dcc WatchSource:0}: Error finding container 0baa540de62a73e49920d8eec6ab65142ddb0d0ced25777fd271ec4cec5f5dcc: Status 404 returned error can't find the container with id 0baa540de62a73e49920d8eec6ab65142ddb0d0ced25777fd271ec4cec5f5dcc Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.709181 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" event={"ID":"83b8e906-da43-4ac7-8700-4689c9852803","Type":"ContainerStarted","Data":"0baa540de62a73e49920d8eec6ab65142ddb0d0ced25777fd271ec4cec5f5dcc"} Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.782338 4696 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.783093 4696 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.783198 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.783463 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7" gracePeriod=15 Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.783501 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca" gracePeriod=15 Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.783546 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0" gracePeriod=15 Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.783574 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619" gracePeriod=15 Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.783604 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9" gracePeriod=15 Mar 18 15:40:19 crc 
kubenswrapper[4696]: I0318 15:40:19.784432 4696 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784689 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784705 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784713 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784720 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784726 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784733 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784743 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784749 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784758 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:40:19 crc 
kubenswrapper[4696]: I0318 15:40:19.784764 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784776 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784782 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784791 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784796 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784806 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784812 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784819 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784826 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.784833 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784839 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784937 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784946 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784953 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784960 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784970 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784977 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784985 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784992 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.784999 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.817404 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.817469 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.817552 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.817590 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.817622 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.817681 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.817715 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.817821 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.818512 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:40:19 crc kubenswrapper[4696]: E0318 15:40:19.900450 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.192:6443: connect: connection refused" 
event="&Event{ObjectMeta:{route-controller-manager-5c7b7b5585-55dj4.189df9bd59810e9f openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-5c7b7b5585-55dj4,UID:83b8e906-da43-4ac7-8700-4689c9852803,APIVersion:v1,ResourceVersion:29869,FieldPath:spec.containers{route-controller-manager},},Reason:Created,Message:Created container route-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:40:19.899608735 +0000 UTC m=+262.905782941,LastTimestamp:2026-03-18 15:40:19.899608735 +0000 UTC m=+262.905782941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.918782 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.918835 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.918859 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc 
kubenswrapper[4696]: I0318 15:40:19.918915 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.918926 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.918967 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.918987 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919016 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919031 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919060 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919102 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919124 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919123 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919170 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919173 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:19 crc kubenswrapper[4696]: I0318 15:40:19.919194 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.115533 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:40:20 crc kubenswrapper[4696]: W0318 15:40:20.150804 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b5df429024e06cf133968b5cfee01fb373247d0d828a2a7a6d15e26b07977e82 WatchSource:0}: Error finding container b5df429024e06cf133968b5cfee01fb373247d0d828a2a7a6d15e26b07977e82: Status 404 returned error can't find the container with id b5df429024e06cf133968b5cfee01fb373247d0d828a2a7a6d15e26b07977e82 Mar 18 15:40:20 crc kubenswrapper[4696]: E0318 15:40:20.193249 4696 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: E0318 15:40:20.193720 4696 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: E0318 15:40:20.194040 4696 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: E0318 15:40:20.194466 4696 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: E0318 15:40:20.194751 4696 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.194779 4696 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 15:40:20 crc kubenswrapper[4696]: E0318 15:40:20.195125 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms" Mar 18 15:40:20 crc kubenswrapper[4696]: E0318 15:40:20.396791 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.631066 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.631644 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.631995 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: 
connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.632614 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.682865 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.683710 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.684124 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.684630 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.717903 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.719291 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.719935 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca" exitCode=0 Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.719980 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9" exitCode=0 Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.719990 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0" exitCode=0 Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.719999 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619" exitCode=2 Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.720029 4696 scope.go:117] "RemoveContainer" containerID="9021799b8ac4c0824c39fe31e6d0d32c410f95fff14772651bbf8f6231b68afe" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.722964 4696 generic.go:334] "Generic (PLEG): container finished" podID="70ccac16-ae70-4ba3-9edf-76707d5643b1" containerID="9f7017c0d7d72c03a3ee62e097c778897f75097c225aacd88493e25e432454bc" exitCode=0 Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.723080 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"70ccac16-ae70-4ba3-9edf-76707d5643b1","Type":"ContainerDied","Data":"9f7017c0d7d72c03a3ee62e097c778897f75097c225aacd88493e25e432454bc"} Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.724786 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.725232 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.725444 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b5df429024e06cf133968b5cfee01fb373247d0d828a2a7a6d15e26b07977e82"} Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.725484 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.725761 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.727536 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" event={"ID":"83b8e906-da43-4ac7-8700-4689c9852803","Type":"ContainerStarted","Data":"cf7a1bbb1b94a2fcb9f3c8dc251f2a08fd18d28d44e0335531905442ebed0404"} Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.728139 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.728213 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.728761 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.729084 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.729432 
4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.729634 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: E0318 15:40:20.798438 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.799188 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.799818 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.800157 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.800639 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.801069 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.801572 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.801927 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.838097 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 
15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.840864 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.842072 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.842612 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.843201 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.843769 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.192:6443: connect: connection refused" Mar 18 15:40:20 crc kubenswrapper[4696]: I0318 15:40:20.844262 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.032726 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.033351 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.033928 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.034601 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.035472 4696 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.035841 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.036388 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.037090 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: E0318 15:40:21.262492 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:40:21Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:40:21Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:40:21Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:40:21Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:491bd3a9c1f09106983d7c3b85f1c97c80dd582f8d1a10e6f6794bf430d7ac19\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8b28c7575f0f57c4dfc6bf61038ad06affeca0d25d7741b97abc25aa54b74e42\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746888156},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:86833de447f25d1d0fc15ed5460c5068cc48b18b78b8108304c5b5fd1dff04ab\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a41181d28dfacb78bea3690c390c965912300bc666e6e31a54a9382dd0329758\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1251896539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:c3c12b935527854220bc939cf4b1e9ec5ea7b799b5530ba0609ec64f044c0a36\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:dd33dff955c181beea0d08607a8c766e68ceb902bff0a014f4416b7a4a86a7c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223856348},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: E0318 15:40:21.263202 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: E0318 15:40:21.263676 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: E0318 15:40:21.263990 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: E0318 15:40:21.264263 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: E0318 15:40:21.264293 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 
15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.301966 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.302998 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.303444 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.303837 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.304679 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.305045 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.305494 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.305835 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.306082 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:21 crc kubenswrapper[4696]: E0318 15:40:21.601011 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.728147 4696 patch_prober.go:28] interesting pod/route-controller-manager-5c7b7b5585-55dj4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.728216 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podUID="83b8e906-da43-4ac7-8700-4689c9852803" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.738884 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:40:21 crc kubenswrapper[4696]: I0318 15:40:21.741709 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5"} Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.181958 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.183034 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.183777 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.184189 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.184503 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.184850 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.185196 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.185559 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.372124 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ccac16-ae70-4ba3-9edf-76707d5643b1-kube-api-access\") pod \"70ccac16-ae70-4ba3-9edf-76707d5643b1\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.372240 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-var-lock\") pod \"70ccac16-ae70-4ba3-9edf-76707d5643b1\" (UID: \"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.372267 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-kubelet-dir\") pod \"70ccac16-ae70-4ba3-9edf-76707d5643b1\" (UID: 
\"70ccac16-ae70-4ba3-9edf-76707d5643b1\") " Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.372582 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "70ccac16-ae70-4ba3-9edf-76707d5643b1" (UID: "70ccac16-ae70-4ba3-9edf-76707d5643b1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.372627 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-var-lock" (OuterVolumeSpecName: "var-lock") pod "70ccac16-ae70-4ba3-9edf-76707d5643b1" (UID: "70ccac16-ae70-4ba3-9edf-76707d5643b1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.379530 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ccac16-ae70-4ba3-9edf-76707d5643b1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "70ccac16-ae70-4ba3-9edf-76707d5643b1" (UID: "70ccac16-ae70-4ba3-9edf-76707d5643b1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.473719 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ccac16-ae70-4ba3-9edf-76707d5643b1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.473765 4696 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.473776 4696 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ccac16-ae70-4ba3-9edf-76707d5643b1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.683965 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.685463 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.686073 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.686921 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.687223 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.687585 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.688059 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.688379 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.688787 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.689377 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.742371 4696 patch_prober.go:28] interesting pod/route-controller-manager-5c7b7b5585-55dj4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.742430 4696 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podUID="83b8e906-da43-4ac7-8700-4689c9852803" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.749658 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"70ccac16-ae70-4ba3-9edf-76707d5643b1","Type":"ContainerDied","Data":"0cac7636bea9f2756f3b13c630ab1e37b88bba8154da44c5e19f76236b435440"} Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.749704 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.749746 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cac7636bea9f2756f3b13c630ab1e37b88bba8154da44c5e19f76236b435440" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.756293 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.757624 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7" exitCode=0 Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.757741 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.757779 4696 scope.go:117] "RemoveContainer" containerID="755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.758608 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.758906 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.759227 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.759495 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.759798 4696 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.760703 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.761130 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.761383 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.774973 4696 scope.go:117] "RemoveContainer" containerID="db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.775112 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": 
dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.775403 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.777327 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.778223 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.778597 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.778859 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.779068 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.779272 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.796072 4696 scope.go:117] "RemoveContainer" containerID="d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.810205 4696 scope.go:117] "RemoveContainer" containerID="8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.825412 4696 scope.go:117] "RemoveContainer" containerID="9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.841658 4696 scope.go:117] "RemoveContainer" containerID="0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.862479 4696 scope.go:117] "RemoveContainer" containerID="755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca" Mar 18 15:40:22 crc kubenswrapper[4696]: E0318 15:40:22.863311 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\": container with ID starting with 755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca not found: ID does not exist" containerID="755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.863409 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca"} err="failed to get container status \"755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\": rpc error: code = NotFound desc = could not find container \"755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca\": container with ID starting with 755e5bda3de7c6242ada198419120a2142e36b3940b08fc783dab961cf6bceca not found: ID does not exist" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.863442 4696 scope.go:117] "RemoveContainer" containerID="db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9" Mar 18 15:40:22 crc kubenswrapper[4696]: E0318 15:40:22.863874 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\": container with ID starting with db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9 not found: ID does not exist" containerID="db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.863924 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9"} err="failed to get container status \"db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\": rpc error: code = NotFound desc = could not find container 
\"db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9\": container with ID starting with db4298156d0975c866fb58c9ce32f19bfe0a44ee7bb420a30be9d117f4d268c9 not found: ID does not exist" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.863956 4696 scope.go:117] "RemoveContainer" containerID="d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0" Mar 18 15:40:22 crc kubenswrapper[4696]: E0318 15:40:22.864263 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\": container with ID starting with d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0 not found: ID does not exist" containerID="d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.864293 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0"} err="failed to get container status \"d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\": rpc error: code = NotFound desc = could not find container \"d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0\": container with ID starting with d4f32e19c4d24c99cdcf6be647407b1f76afbb4988c8fa949275eef3297de5f0 not found: ID does not exist" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.864309 4696 scope.go:117] "RemoveContainer" containerID="8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619" Mar 18 15:40:22 crc kubenswrapper[4696]: E0318 15:40:22.864726 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\": container with ID starting with 8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619 not found: ID does not exist" 
containerID="8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.864751 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619"} err="failed to get container status \"8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\": rpc error: code = NotFound desc = could not find container \"8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619\": container with ID starting with 8260b5b462c19bc8e5394bc7e603b84d8bc2b356d06544207f379efa993ee619 not found: ID does not exist" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.864768 4696 scope.go:117] "RemoveContainer" containerID="9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7" Mar 18 15:40:22 crc kubenswrapper[4696]: E0318 15:40:22.865155 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\": container with ID starting with 9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7 not found: ID does not exist" containerID="9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.865195 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7"} err="failed to get container status \"9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\": rpc error: code = NotFound desc = could not find container \"9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7\": container with ID starting with 9e49d8fb768adc79af422105f577705b1dd3e1b1fc4978ecdb0e51a84e2279a7 not found: ID does not exist" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.865222 4696 scope.go:117] 
"RemoveContainer" containerID="0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2" Mar 18 15:40:22 crc kubenswrapper[4696]: E0318 15:40:22.865623 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\": container with ID starting with 0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2 not found: ID does not exist" containerID="0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.865648 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2"} err="failed to get container status \"0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\": rpc error: code = NotFound desc = could not find container \"0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2\": container with ID starting with 0283cc3dbd9624ebcdd3142610d0b6373320a457dd076f3048de0464503550e2 not found: ID does not exist" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880106 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880227 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880275 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880290 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880371 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880460 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880668 4696 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880680 4696 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:22 crc kubenswrapper[4696]: I0318 15:40:22.880688 4696 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.087895 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.088428 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.090034 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.091221 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.091502 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.091805 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.092130 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.092636 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:23 crc kubenswrapper[4696]: E0318 15:40:23.201726 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Mar 18 15:40:23 crc kubenswrapper[4696]: I0318 15:40:23.603437 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.270187 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.271226 4696 status_manager.go:851] "Failed to get status for pod" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" pod="openshift-marketplace/redhat-operators-q82hl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q82hl\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.271501 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.271683 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.271820 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.271981 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.272115 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.272414 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.273172 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" 
pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.350574 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.351256 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.351798 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.352108 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.352501 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.352733 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.352933 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.353141 4696 status_manager.go:851] "Failed to get status for pod" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" pod="openshift-marketplace/redhat-operators-q82hl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q82hl\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:24 crc kubenswrapper[4696]: I0318 15:40:24.353356 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:26 crc kubenswrapper[4696]: E0318 15:40:26.402472 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="6.4s" Mar 18 15:40:27 crc kubenswrapper[4696]: I0318 15:40:27.601087 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:27 crc kubenswrapper[4696]: I0318 15:40:27.602512 4696 status_manager.go:851] "Failed to get status for pod" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" pod="openshift-marketplace/redhat-operators-q82hl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q82hl\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:27 crc kubenswrapper[4696]: I0318 15:40:27.603108 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:27 crc kubenswrapper[4696]: I0318 15:40:27.603636 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:27 crc kubenswrapper[4696]: I0318 15:40:27.604339 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" 
pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:27 crc kubenswrapper[4696]: I0318 15:40:27.605187 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:27 crc kubenswrapper[4696]: I0318 15:40:27.606030 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:27 crc kubenswrapper[4696]: I0318 15:40:27.606353 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:27 crc kubenswrapper[4696]: E0318 15:40:27.674424 4696 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" volumeName="registry-storage" Mar 18 15:40:29 
crc kubenswrapper[4696]: E0318 15:40:29.301307 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-5c7b7b5585-55dj4.189df9bd59810e9f openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-5c7b7b5585-55dj4,UID:83b8e906-da43-4ac7-8700-4689c9852803,APIVersion:v1,ResourceVersion:29869,FieldPath:spec.containers{route-controller-manager},},Reason:Created,Message:Created container route-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:40:19.899608735 +0000 UTC m=+262.905782941,LastTimestamp:2026-03-18 15:40:19.899608735 +0000 UTC m=+262.905782941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:40:30 crc kubenswrapper[4696]: I0318 15:40:30.166306 4696 patch_prober.go:28] interesting pod/route-controller-manager-5c7b7b5585-55dj4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:30 crc kubenswrapper[4696]: I0318 15:40:30.166450 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podUID="83b8e906-da43-4ac7-8700-4689c9852803" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Mar 18 15:40:31 crc kubenswrapper[4696]: E0318 15:40:31.321245 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:40:31Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:40:31Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:40:31Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:40:31Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:491bd3a9c1f09106983d7c3b85f1c97c80dd582f8d1a10e6f6794bf430d7ac19\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8b28c7575f0f57c4dfc6bf61038ad06affeca0d25d7741b97abc25aa54b74e42\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746888156},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:86833de447f25d1d0fc15ed5460c5068cc48b18b78b8108304c5b5fd1dff04ab\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a41181d28dfacb78bea3690c390c965912300bc666e6e31a54a9382dd0329758\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18
\\\"],\\\"sizeBytes\\\":1251896539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:c3c12b935527854220bc939cf4b1e9ec5ea7b799b5530ba0609ec64f044c0a36\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:dd33dff955c181beea0d08607a8c766e68ceb902bff0a014f4416b7a4a86a7c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223856348},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":
907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7
e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: E0318 15:40:31.321968 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: E0318 15:40:31.322463 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: E0318 15:40:31.322720 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: E0318 15:40:31.322943 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" 
Mar 18 15:40:31 crc kubenswrapper[4696]: E0318 15:40:31.322961 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.597974 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.598940 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.599900 4696 status_manager.go:851] "Failed to get status for pod" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" pod="openshift-marketplace/redhat-operators-q82hl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q82hl\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.600494 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.600723 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 
15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.600901 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.601085 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.601243 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.601406 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.616309 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.616364 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:31 crc kubenswrapper[4696]: E0318 15:40:31.616870 4696 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.617468 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:31 crc kubenswrapper[4696]: I0318 15:40:31.825551 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"49898ce0a06e9c99cc775eb1af8e34a9a2b815f2904b299a93f3bf9c1de1c0a2"} Mar 18 15:40:32 crc kubenswrapper[4696]: E0318 15:40:32.804005 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="7s" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.834727 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.835424 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.835552 4696 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335" exitCode=1 Mar 18 
15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.835629 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335"} Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.837025 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.837289 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.837590 4696 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2a97d2b60d441afdc6cc239b7e83cd43b07bb6b868cbb6f6bea6d60d26d24dbd" exitCode=0 Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.837635 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2a97d2b60d441afdc6cc239b7e83cd43b07bb6b868cbb6f6bea6d60d26d24dbd"} Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.837723 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.837867 4696 scope.go:117] "RemoveContainer" containerID="a7b4da8be7e22b17201dd8593f7e1b3943344f19bef3957c76cb18f2c7427335" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.837893 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.837994 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.838265 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.838498 4696 status_manager.go:851] "Failed to get status for pod" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" pod="openshift-marketplace/redhat-operators-q82hl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q82hl\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: E0318 15:40:32.838584 4696 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.838790 4696 status_manager.go:851] 
"Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.839043 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.839593 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.840469 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.841289 4696 status_manager.go:851] "Failed to get status for pod" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" pod="openshift-marketplace/community-operators-xd5v7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xd5v7\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc 
kubenswrapper[4696]: I0318 15:40:32.841934 4696 status_manager.go:851] "Failed to get status for pod" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" pod="openshift-marketplace/redhat-operators-q82hl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q82hl\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.842269 4696 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.842567 4696 status_manager.go:851] "Failed to get status for pod" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.842870 4696 status_manager.go:851] "Failed to get status for pod" podUID="83b8e906-da43-4ac7-8700-4689c9852803" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c7b7b5585-55dj4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.843123 4696 status_manager.go:851] "Failed to get status for pod" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" pod="openshift-marketplace/certified-operators-ldd6n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ldd6n\": dial tcp 
38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.843609 4696 status_manager.go:851] "Failed to get status for pod" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" pod="openshift-marketplace/certified-operators-8tkdr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-8tkdr\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.844014 4696 status_manager.go:851] "Failed to get status for pod" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" pod="openshift-marketplace/community-operators-sjnmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-sjnmp\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:32 crc kubenswrapper[4696]: I0318 15:40:32.844444 4696 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 18 15:40:33 crc kubenswrapper[4696]: I0318 15:40:33.857993 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a03c2d1ecde998878a6136e52748afe7915b3114b557abac6fd7a3c183b643c3"} Mar 18 15:40:33 crc kubenswrapper[4696]: I0318 15:40:33.858491 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0c0d4b923af084348946762dbe7ee11d0dc98e03f5aa40e7cca0077ec2c3d758"} Mar 18 15:40:33 crc kubenswrapper[4696]: I0318 15:40:33.858512 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e0e2596106198f8c62e526cd97270af8d7d4257114a220ffb5304b105f115e2d"} Mar 18 15:40:33 crc kubenswrapper[4696]: I0318 15:40:33.858539 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3064c6703039fb1ecdc9a9e12f860bcd6d4e14091219823a8af3129f84ad27cf"} Mar 18 15:40:33 crc kubenswrapper[4696]: I0318 15:40:33.865882 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:40:33 crc kubenswrapper[4696]: I0318 15:40:33.866644 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 15:40:33 crc kubenswrapper[4696]: I0318 15:40:33.866699 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eae272ea3ac62c225829115f555293406d1f23d8bea393d01c0b3493c9153075"} Mar 18 15:40:34 crc kubenswrapper[4696]: I0318 15:40:34.879890 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e7a0fd882d8c7b82f429fcb9d805778529651b38ac8a38bbe99caae7c636dcb0"} Mar 18 15:40:34 crc kubenswrapper[4696]: I0318 15:40:34.880440 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:34 crc kubenswrapper[4696]: I0318 15:40:34.880468 4696 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:34 crc kubenswrapper[4696]: I0318 15:40:34.880504 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:36 crc kubenswrapper[4696]: I0318 15:40:36.617614 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:36 crc kubenswrapper[4696]: I0318 15:40:36.618662 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:36 crc kubenswrapper[4696]: I0318 15:40:36.623506 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:37 crc kubenswrapper[4696]: I0318 15:40:37.361941 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:40:39 crc kubenswrapper[4696]: I0318 15:40:39.888568 4696 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:39 crc kubenswrapper[4696]: I0318 15:40:39.907006 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:39 crc kubenswrapper[4696]: I0318 15:40:39.907039 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:39 crc kubenswrapper[4696]: I0318 15:40:39.910980 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:39 crc kubenswrapper[4696]: I0318 15:40:39.913125 4696 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="51884505-8f2e-4c15-980f-01ebf7791848" Mar 18 15:40:40 crc kubenswrapper[4696]: I0318 15:40:40.167148 4696 patch_prober.go:28] interesting pod/route-controller-manager-5c7b7b5585-55dj4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:40 crc kubenswrapper[4696]: I0318 15:40:40.167221 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podUID="83b8e906-da43-4ac7-8700-4689c9852803" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:40:40 crc kubenswrapper[4696]: I0318 15:40:40.911820 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:40 crc kubenswrapper[4696]: I0318 15:40:40.912150 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64ea155e-f1fc-4919-94e1-249625f0fefb" Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.091380 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.094905 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.184217 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.184503 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.184664 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.185244 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.185392 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05" gracePeriod=600 Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.922651 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05" exitCode=0 Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.922794 4696 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05"} Mar 18 15:40:42 crc kubenswrapper[4696]: I0318 15:40:42.923415 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"a757e73374a5d2719000efd9790c0f7bc244089cee9596be88ce0ad60115e2a2"} Mar 18 15:40:47 crc kubenswrapper[4696]: I0318 15:40:47.365179 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:40:47 crc kubenswrapper[4696]: I0318 15:40:47.634266 4696 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="51884505-8f2e-4c15-980f-01ebf7791848" Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.166616 4696 patch_prober.go:28] interesting pod/route-controller-manager-5c7b7b5585-55dj4 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.166662 4696 patch_prober.go:28] interesting pod/route-controller-manager-5c7b7b5585-55dj4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.167690 
4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podUID="83b8e906-da43-4ac7-8700-4689c9852803" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.167772 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podUID="83b8e906-da43-4ac7-8700-4689c9852803" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.514197 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.779716 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.982727 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5c7b7b5585-55dj4_83b8e906-da43-4ac7-8700-4689c9852803/route-controller-manager/0.log" Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.983099 4696 generic.go:334] "Generic (PLEG): container finished" podID="83b8e906-da43-4ac7-8700-4689c9852803" containerID="cf7a1bbb1b94a2fcb9f3c8dc251f2a08fd18d28d44e0335531905442ebed0404" exitCode=255 Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.983137 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" 
event={"ID":"83b8e906-da43-4ac7-8700-4689c9852803","Type":"ContainerDied","Data":"cf7a1bbb1b94a2fcb9f3c8dc251f2a08fd18d28d44e0335531905442ebed0404"} Mar 18 15:40:50 crc kubenswrapper[4696]: I0318 15:40:50.983942 4696 scope.go:117] "RemoveContainer" containerID="cf7a1bbb1b94a2fcb9f3c8dc251f2a08fd18d28d44e0335531905442ebed0404" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.017494 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.108835 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.195634 4696 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.196445 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=32.196418341 podStartE2EDuration="32.196418341s" podCreationTimestamp="2026-03-18 15:40:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:39.383643517 +0000 UTC m=+282.389817773" watchObservedRunningTime="2026-03-18 15:40:51.196418341 +0000 UTC m=+294.202592557" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.202618 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.202740 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.206953 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:40:51 crc kubenswrapper[4696]: 
I0318 15:40:51.228257 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.228233826 podStartE2EDuration="12.228233826s" podCreationTimestamp="2026-03-18 15:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:51.224463547 +0000 UTC m=+294.230637753" watchObservedRunningTime="2026-03-18 15:40:51.228233826 +0000 UTC m=+294.234408032" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.331881 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.701419 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.723312 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.762775 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.993610 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5c7b7b5585-55dj4_83b8e906-da43-4ac7-8700-4689c9852803/route-controller-manager/0.log" Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.994407 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" event={"ID":"83b8e906-da43-4ac7-8700-4689c9852803","Type":"ContainerStarted","Data":"bef4f50e22b68d68eb13c7acbc775d1d7d3f783d4dec3d1c5078215930718091"} Mar 18 15:40:51 crc kubenswrapper[4696]: I0318 15:40:51.995236 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.010184 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podStartSLOduration=37.010168836 podStartE2EDuration="37.010168836s" podCreationTimestamp="2026-03-18 15:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:39.493092642 +0000 UTC m=+282.499266848" watchObservedRunningTime="2026-03-18 15:40:52.010168836 +0000 UTC m=+295.016343032" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.014183 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.036204 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.554861 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.592999 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.594307 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.828953 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.919878 4696 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.995183 4696 patch_prober.go:28] interesting pod/route-controller-manager-5c7b7b5585-55dj4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:52 crc kubenswrapper[4696]: I0318 15:40:52.995244 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podUID="83b8e906-da43-4ac7-8700-4689c9852803" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.012014 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.143872 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.224932 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.340034 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.441570 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 
15:40:53.488813 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.495331 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.515881 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.549998 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.657779 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.801369 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.815163 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.836029 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.849773 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.933823 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 15:40:53 crc kubenswrapper[4696]: I0318 15:40:53.967803 4696 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.000386 4696 patch_prober.go:28] interesting pod/route-controller-manager-5c7b7b5585-55dj4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.000454 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" podUID="83b8e906-da43-4ac7-8700-4689c9852803" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.074431 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.170856 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.290615 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.312775 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.316314 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.351707 4696 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.499489 4696 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.655249 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.669412 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.910837 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.911226 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.940641 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 15:40:54 crc kubenswrapper[4696]: I0318 15:40:54.950998 4696 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.003815 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.134860 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.174138 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.239926 4696 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.298742 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.333408 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.386032 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.388182 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.388182 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.433449 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.439260 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.472407 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.494136 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.514937 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.535281 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.613233 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.635985 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.659505 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.698340 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.843015 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.888664 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.914566 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 15:40:55 crc kubenswrapper[4696]: I0318 15:40:55.926631 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.021099 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.036770 4696 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.073571 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.097001 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.102414 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.180737 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.289910 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.310715 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.368452 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.429971 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.500721 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.541863 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.552660 4696 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.604375 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.632152 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.686815 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.703213 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.926841 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 15:40:56 crc kubenswrapper[4696]: I0318 15:40:56.951137 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.001582 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.033910 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.051354 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.057606 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 
15:40:57.102141 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.124025 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.160210 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.250211 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.363098 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.375912 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.474663 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.507277 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.556172 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.569791 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.585685 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 
18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.627295 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.638796 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.732210 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.753113 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.827311 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 15:40:57 crc kubenswrapper[4696]: I0318 15:40:57.886095 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.007897 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.077090 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.093662 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.096054 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 
15:40:58.105551 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.160063 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.231558 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.283612 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.319256 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.345940 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.475625 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.708598 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.710783 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.832846 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.930427 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.947163 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 15:40:58 crc kubenswrapper[4696]: I0318 15:40:58.952271 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.015092 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.116780 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.170374 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c7b7b5585-55dj4" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.218380 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.271882 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.305112 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.331969 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.344318 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 15:40:59 crc 
kubenswrapper[4696]: I0318 15:40:59.346850 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.386231 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.413979 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.613645 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.632057 4696 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.794538 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.809964 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.836331 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.852360 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.888674 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.906615 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.933059 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 15:40:59 crc kubenswrapper[4696]: I0318 15:40:59.951379 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.072278 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.076463 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.124609 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.234066 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.241753 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.250308 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.267681 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.335582 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.665201 4696 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.767444 4696 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.833478 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 15:41:00 crc kubenswrapper[4696]: I0318 15:41:00.875177 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.010447 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.025450 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.146233 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.158171 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.206626 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.297970 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.320053 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 15:41:01 crc 
kubenswrapper[4696]: I0318 15:41:01.349139 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.390756 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.453914 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.455376 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.531423 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.559841 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.574324 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.655202 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.740284 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.794066 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.799175 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.839296 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.866737 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.887125 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.921475 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.927852 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 15:41:01 crc kubenswrapper[4696]: I0318 15:41:01.968065 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.068289 4696 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.068738 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5" gracePeriod=5 Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.076869 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.274539 4696 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.302677 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.406293 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.411831 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.553573 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.576425 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.577051 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.618139 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.759278 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.868514 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 15:41:02 crc kubenswrapper[4696]: I0318 15:41:02.986504 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"console-config" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.103886 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.117858 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.121242 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.244626 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.248964 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.266128 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.396213 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.402655 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.518787 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.563311 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 15:41:03 crc 
kubenswrapper[4696]: I0318 15:41:03.568123 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.581470 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.861314 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.899592 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.904816 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.933148 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.972467 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 15:41:03 crc kubenswrapper[4696]: I0318 15:41:03.988561 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.022475 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.066685 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.122838 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.134959 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.179971 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.215667 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.230615 4696 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.297463 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.373504 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.652654 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.701883 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.728956 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.838098 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.864884 4696 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.986014 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:41:04 crc kubenswrapper[4696]: I0318 15:41:04.987416 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 15:41:05 crc kubenswrapper[4696]: I0318 15:41:05.014373 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:41:05 crc kubenswrapper[4696]: I0318 15:41:05.045815 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 15:41:05 crc kubenswrapper[4696]: I0318 15:41:05.114960 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 15:41:05 crc kubenswrapper[4696]: I0318 15:41:05.143306 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:41:05 crc kubenswrapper[4696]: I0318 15:41:05.249432 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 15:41:05 crc kubenswrapper[4696]: I0318 15:41:05.426541 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:41:05 crc kubenswrapper[4696]: I0318 15:41:05.549966 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 15:41:05 crc kubenswrapper[4696]: I0318 15:41:05.593242 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 15:41:05 crc 
kubenswrapper[4696]: I0318 15:41:05.642852 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 15:41:06 crc kubenswrapper[4696]: I0318 15:41:06.159675 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 15:41:06 crc kubenswrapper[4696]: I0318 15:41:06.758306 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:41:06 crc kubenswrapper[4696]: I0318 15:41:06.794862 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 15:41:06 crc kubenswrapper[4696]: I0318 15:41:06.874903 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.144136 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.206497 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.257238 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.368314 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.370321 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.475724 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.545125 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.634934 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.635021 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.769898 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.770003 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.770060 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.770088 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.770106 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.770405 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.770471 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.770473 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.770611 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.778569 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.871366 4696 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.871406 4696 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.871423 4696 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.871433 4696 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:07 crc kubenswrapper[4696]: I0318 15:41:07.871447 4696 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.064445 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 15:41:08 crc 
kubenswrapper[4696]: I0318 15:41:08.064731 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.074131 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.074178 4696 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5" exitCode=137 Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.074237 4696 scope.go:117] "RemoveContainer" containerID="5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5" Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.074279 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.097317 4696 scope.go:117] "RemoveContainer" containerID="5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5" Mar 18 15:41:08 crc kubenswrapper[4696]: E0318 15:41:08.097748 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5\": container with ID starting with 5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5 not found: ID does not exist" containerID="5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5" Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.097873 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5"} err="failed to get container status 
\"5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5\": rpc error: code = NotFound desc = could not find container \"5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5\": container with ID starting with 5ff856bdfe7f41495a16e373c3b1f37ec16645096184a3114efe97d8e07027f5 not found: ID does not exist" Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.496814 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 15:41:08 crc kubenswrapper[4696]: I0318 15:41:08.775044 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:41:09 crc kubenswrapper[4696]: I0318 15:41:09.105994 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 15:41:09 crc kubenswrapper[4696]: I0318 15:41:09.605201 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 15:41:09 crc kubenswrapper[4696]: I0318 15:41:09.605457 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 18 15:41:09 crc kubenswrapper[4696]: I0318 15:41:09.617332 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:41:09 crc kubenswrapper[4696]: I0318 15:41:09.617373 4696 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5772c118-2c37-4c47-8a23-d0be7471d00f" Mar 18 15:41:09 crc kubenswrapper[4696]: I0318 15:41:09.620314 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:41:09 crc kubenswrapper[4696]: I0318 
15:41:09.620354 4696 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5772c118-2c37-4c47-8a23-d0be7471d00f" Mar 18 15:41:26 crc kubenswrapper[4696]: I0318 15:41:26.807886 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-485wk"] Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.443200 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ldd6n"] Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.443965 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ldd6n" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerName="registry-server" containerID="cri-o://72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235" gracePeriod=2 Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.652831 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sjnmp"] Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.653754 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sjnmp" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerName="registry-server" containerID="cri-o://76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae" gracePeriod=2 Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.799132 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.902256 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-utilities\") pod \"459c4b74-f710-4b3a-b053-8b0326b87cb5\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.902321 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46xmj\" (UniqueName: \"kubernetes.io/projected/459c4b74-f710-4b3a-b053-8b0326b87cb5-kube-api-access-46xmj\") pod \"459c4b74-f710-4b3a-b053-8b0326b87cb5\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.902366 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-catalog-content\") pod \"459c4b74-f710-4b3a-b053-8b0326b87cb5\" (UID: \"459c4b74-f710-4b3a-b053-8b0326b87cb5\") " Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.906743 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-utilities" (OuterVolumeSpecName: "utilities") pod "459c4b74-f710-4b3a-b053-8b0326b87cb5" (UID: "459c4b74-f710-4b3a-b053-8b0326b87cb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.910020 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459c4b74-f710-4b3a-b053-8b0326b87cb5-kube-api-access-46xmj" (OuterVolumeSpecName: "kube-api-access-46xmj") pod "459c4b74-f710-4b3a-b053-8b0326b87cb5" (UID: "459c4b74-f710-4b3a-b053-8b0326b87cb5"). InnerVolumeSpecName "kube-api-access-46xmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:48 crc kubenswrapper[4696]: I0318 15:41:48.954013 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "459c4b74-f710-4b3a-b053-8b0326b87cb5" (UID: "459c4b74-f710-4b3a-b053-8b0326b87cb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.003879 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.003913 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46xmj\" (UniqueName: \"kubernetes.io/projected/459c4b74-f710-4b3a-b053-8b0326b87cb5-kube-api-access-46xmj\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.003924 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459c4b74-f710-4b3a-b053-8b0326b87cb5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.061866 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.105189 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-utilities\") pod \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.105774 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4xdg\" (UniqueName: \"kubernetes.io/projected/b0569eea-b948-4633-91d7-4ebfa02d5a8b-kube-api-access-g4xdg\") pod \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.105925 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-catalog-content\") pod \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\" (UID: \"b0569eea-b948-4633-91d7-4ebfa02d5a8b\") " Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.106019 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-utilities" (OuterVolumeSpecName: "utilities") pod "b0569eea-b948-4633-91d7-4ebfa02d5a8b" (UID: "b0569eea-b948-4633-91d7-4ebfa02d5a8b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.106347 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.108990 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0569eea-b948-4633-91d7-4ebfa02d5a8b-kube-api-access-g4xdg" (OuterVolumeSpecName: "kube-api-access-g4xdg") pod "b0569eea-b948-4633-91d7-4ebfa02d5a8b" (UID: "b0569eea-b948-4633-91d7-4ebfa02d5a8b"). InnerVolumeSpecName "kube-api-access-g4xdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.160829 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0569eea-b948-4633-91d7-4ebfa02d5a8b" (UID: "b0569eea-b948-4633-91d7-4ebfa02d5a8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.207846 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4xdg\" (UniqueName: \"kubernetes.io/projected/b0569eea-b948-4633-91d7-4ebfa02d5a8b-kube-api-access-g4xdg\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.207873 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0569eea-b948-4633-91d7-4ebfa02d5a8b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.305047 4696 generic.go:334] "Generic (PLEG): container finished" podID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerID="76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae" exitCode=0 Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.305117 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjnmp" event={"ID":"b0569eea-b948-4633-91d7-4ebfa02d5a8b","Type":"ContainerDied","Data":"76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae"} Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.305131 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sjnmp" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.305143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjnmp" event={"ID":"b0569eea-b948-4633-91d7-4ebfa02d5a8b","Type":"ContainerDied","Data":"de5dad7bfdb6ca619e578404c5ca190b46cc313e4fc55f84ade1b85ef8e6974e"} Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.305160 4696 scope.go:117] "RemoveContainer" containerID="76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.307170 4696 generic.go:334] "Generic (PLEG): container finished" podID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerID="72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235" exitCode=0 Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.307193 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldd6n" event={"ID":"459c4b74-f710-4b3a-b053-8b0326b87cb5","Type":"ContainerDied","Data":"72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235"} Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.307209 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldd6n" event={"ID":"459c4b74-f710-4b3a-b053-8b0326b87cb5","Type":"ContainerDied","Data":"b02f923bfeebc6cccc44585446fb6d69cf59b1a3f79ebcefc84c84a3af457145"} Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.307299 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ldd6n" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.322392 4696 scope.go:117] "RemoveContainer" containerID="97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.339060 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sjnmp"] Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.345767 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sjnmp"] Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.350399 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ldd6n"] Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.353932 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ldd6n"] Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.359125 4696 scope.go:117] "RemoveContainer" containerID="3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.371343 4696 scope.go:117] "RemoveContainer" containerID="76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae" Mar 18 15:41:49 crc kubenswrapper[4696]: E0318 15:41:49.371742 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae\": container with ID starting with 76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae not found: ID does not exist" containerID="76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.371784 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae"} err="failed to get container status \"76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae\": rpc error: code = NotFound desc = could not find container \"76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae\": container with ID starting with 76569577d3bd4f92b9aa620a4f55bfa105a93cf68b4f61e903d3f234d08696ae not found: ID does not exist" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.371812 4696 scope.go:117] "RemoveContainer" containerID="97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3" Mar 18 15:41:49 crc kubenswrapper[4696]: E0318 15:41:49.372228 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3\": container with ID starting with 97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3 not found: ID does not exist" containerID="97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.372264 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3"} err="failed to get container status \"97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3\": rpc error: code = NotFound desc = could not find container \"97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3\": container with ID starting with 97d41c253a76ab8ed95206382e7bde60ac2c022e3508441d8d60b055243423b3 not found: ID does not exist" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.372286 4696 scope.go:117] "RemoveContainer" containerID="3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055" Mar 18 15:41:49 crc kubenswrapper[4696]: E0318 15:41:49.372604 4696 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055\": container with ID starting with 3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055 not found: ID does not exist" containerID="3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.372629 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055"} err="failed to get container status \"3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055\": rpc error: code = NotFound desc = could not find container \"3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055\": container with ID starting with 3de20bdffa150eda91b07bfcc49ac946dc75b6d28734709ca1b4489441c2d055 not found: ID does not exist" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.372644 4696 scope.go:117] "RemoveContainer" containerID="72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.383797 4696 scope.go:117] "RemoveContainer" containerID="60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.398145 4696 scope.go:117] "RemoveContainer" containerID="cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.409453 4696 scope.go:117] "RemoveContainer" containerID="72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235" Mar 18 15:41:49 crc kubenswrapper[4696]: E0318 15:41:49.409910 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235\": container with ID starting with 
72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235 not found: ID does not exist" containerID="72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.409955 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235"} err="failed to get container status \"72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235\": rpc error: code = NotFound desc = could not find container \"72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235\": container with ID starting with 72e86da246f5b4b74e974090a72f4fa2dfdd3bc490ac7e7315853777471d8235 not found: ID does not exist" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.409979 4696 scope.go:117] "RemoveContainer" containerID="60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e" Mar 18 15:41:49 crc kubenswrapper[4696]: E0318 15:41:49.410259 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e\": container with ID starting with 60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e not found: ID does not exist" containerID="60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.410283 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e"} err="failed to get container status \"60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e\": rpc error: code = NotFound desc = could not find container \"60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e\": container with ID starting with 60e81796c613617c7df3de30b93abd6d937d34a1394fd7f6728d634d34f7268e not found: ID does not 
exist" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.410298 4696 scope.go:117] "RemoveContainer" containerID="cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064" Mar 18 15:41:49 crc kubenswrapper[4696]: E0318 15:41:49.410627 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064\": container with ID starting with cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064 not found: ID does not exist" containerID="cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.410655 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064"} err="failed to get container status \"cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064\": rpc error: code = NotFound desc = could not find container \"cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064\": container with ID starting with cbc0c8dbceeccfb2f92d1b0d78837ef6e8e790791c5796d756ecf7cd99bb0064 not found: ID does not exist" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.605164 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" path="/var/lib/kubelet/pods/459c4b74-f710-4b3a-b053-8b0326b87cb5/volumes" Mar 18 15:41:49 crc kubenswrapper[4696]: I0318 15:41:49.606623 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" path="/var/lib/kubelet/pods/b0569eea-b948-4633-91d7-4ebfa02d5a8b/volumes" Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.243573 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q82hl"] Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.243921 4696 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q82hl" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="registry-server" containerID="cri-o://87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78" gracePeriod=2 Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.646086 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.760624 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-catalog-content\") pod \"cfaa1769-2588-440e-9757-ded58dcb0ac3\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.760730 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddhzn\" (UniqueName: \"kubernetes.io/projected/cfaa1769-2588-440e-9757-ded58dcb0ac3-kube-api-access-ddhzn\") pod \"cfaa1769-2588-440e-9757-ded58dcb0ac3\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.760839 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-utilities\") pod \"cfaa1769-2588-440e-9757-ded58dcb0ac3\" (UID: \"cfaa1769-2588-440e-9757-ded58dcb0ac3\") " Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.761927 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-utilities" (OuterVolumeSpecName: "utilities") pod "cfaa1769-2588-440e-9757-ded58dcb0ac3" (UID: "cfaa1769-2588-440e-9757-ded58dcb0ac3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.765886 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfaa1769-2588-440e-9757-ded58dcb0ac3-kube-api-access-ddhzn" (OuterVolumeSpecName: "kube-api-access-ddhzn") pod "cfaa1769-2588-440e-9757-ded58dcb0ac3" (UID: "cfaa1769-2588-440e-9757-ded58dcb0ac3"). InnerVolumeSpecName "kube-api-access-ddhzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.843177 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" podUID="a589c8ef-17db-4df7-affb-8a40c753aaaa" containerName="oauth-openshift" containerID="cri-o://3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7" gracePeriod=15 Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.861892 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.861944 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddhzn\" (UniqueName: \"kubernetes.io/projected/cfaa1769-2588-440e-9757-ded58dcb0ac3-kube-api-access-ddhzn\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.905163 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfaa1769-2588-440e-9757-ded58dcb0ac3" (UID: "cfaa1769-2588-440e-9757-ded58dcb0ac3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:51 crc kubenswrapper[4696]: I0318 15:41:51.966182 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfaa1769-2588-440e-9757-ded58dcb0ac3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.183284 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.328088 4696 generic.go:334] "Generic (PLEG): container finished" podID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerID="87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78" exitCode=0 Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.328489 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q82hl" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.328575 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q82hl" event={"ID":"cfaa1769-2588-440e-9757-ded58dcb0ac3","Type":"ContainerDied","Data":"87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78"} Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.328613 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q82hl" event={"ID":"cfaa1769-2588-440e-9757-ded58dcb0ac3","Type":"ContainerDied","Data":"760e7def20f0e7e5d9f10888e3613f302cd10237ecd26c6e7ea21722e1dacb4e"} Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.328639 4696 scope.go:117] "RemoveContainer" containerID="87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.330760 4696 generic.go:334] "Generic (PLEG): container finished" podID="a589c8ef-17db-4df7-affb-8a40c753aaaa" 
containerID="3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7" exitCode=0 Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.330805 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" event={"ID":"a589c8ef-17db-4df7-affb-8a40c753aaaa","Type":"ContainerDied","Data":"3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7"} Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.330829 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" event={"ID":"a589c8ef-17db-4df7-affb-8a40c753aaaa","Type":"ContainerDied","Data":"25a5ad267780eb3af8c9d8dbcf28a087a20c3ea4016b8a55510b450eef7e06de"} Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.330891 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-485wk" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.356876 4696 scope.go:117] "RemoveContainer" containerID="320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.361187 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q82hl"] Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.364220 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q82hl"] Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.371197 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-idp-0-file-data\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.371237 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-serving-cert\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.371299 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-service-ca\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.371320 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-dir\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.371544 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.371977 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372181 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmlfq\" (UniqueName: \"kubernetes.io/projected/a589c8ef-17db-4df7-affb-8a40c753aaaa-kube-api-access-qmlfq\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372220 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-error\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372280 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-trusted-ca-bundle\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372326 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-ocp-branding-template\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372360 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-login\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 
18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372376 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-policies\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372394 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-provider-selection\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372427 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-router-certs\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372455 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-cliconfig\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372475 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-session\") pod \"a589c8ef-17db-4df7-affb-8a40c753aaaa\" (UID: \"a589c8ef-17db-4df7-affb-8a40c753aaaa\") " Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372686 4696 reconciler_common.go:293] 
"Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.372703 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.373576 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.375257 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.376190 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.377479 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.377591 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a589c8ef-17db-4df7-affb-8a40c753aaaa-kube-api-access-qmlfq" (OuterVolumeSpecName: "kube-api-access-qmlfq") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "kube-api-access-qmlfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.377621 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.377838 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.380774 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.384309 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.384693 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.385821 4696 scope.go:117] "RemoveContainer" containerID="7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.385874 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.387889 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a589c8ef-17db-4df7-affb-8a40c753aaaa" (UID: "a589c8ef-17db-4df7-affb-8a40c753aaaa"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.418301 4696 scope.go:117] "RemoveContainer" containerID="87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78" Mar 18 15:41:52 crc kubenswrapper[4696]: E0318 15:41:52.418767 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78\": container with ID starting with 87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78 not found: ID does not exist" containerID="87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.418811 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78"} err="failed to get container status \"87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78\": rpc error: code = NotFound desc = could not find container \"87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78\": container with ID starting with 87f868d2e0b51e30a352e6674a93813e1cf42c1fb914fc6fa6f44cc5e05c9d78 not found: ID does not exist" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.418840 4696 scope.go:117] "RemoveContainer" containerID="320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e" Mar 18 15:41:52 crc kubenswrapper[4696]: E0318 15:41:52.419099 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e\": container with ID starting with 320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e not found: ID does not exist" containerID="320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.419123 
4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e"} err="failed to get container status \"320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e\": rpc error: code = NotFound desc = could not find container \"320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e\": container with ID starting with 320f3e588eb26897c29304c277c1200a3efd1ba29ef03a63bd342c8457d80b1e not found: ID does not exist" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.419137 4696 scope.go:117] "RemoveContainer" containerID="7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356" Mar 18 15:41:52 crc kubenswrapper[4696]: E0318 15:41:52.419333 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356\": container with ID starting with 7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356 not found: ID does not exist" containerID="7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.419354 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356"} err="failed to get container status \"7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356\": rpc error: code = NotFound desc = could not find container \"7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356\": container with ID starting with 7912e162910f43ef4c70d5a9890c1141577da6d18a02aac38dc12be72b5ce356 not found: ID does not exist" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.419368 4696 scope.go:117] "RemoveContainer" containerID="3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 
15:41:52.441800 4696 scope.go:117] "RemoveContainer" containerID="3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7" Mar 18 15:41:52 crc kubenswrapper[4696]: E0318 15:41:52.442405 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7\": container with ID starting with 3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7 not found: ID does not exist" containerID="3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.442458 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7"} err="failed to get container status \"3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7\": rpc error: code = NotFound desc = could not find container \"3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7\": container with ID starting with 3b8b6835ca133944604761dba0aca9641f0ae62efd6047b4eb321034ffac36b7 not found: ID does not exist" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473731 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473766 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473778 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473790 4696 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473801 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473812 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473821 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473830 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473838 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473849 4696 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473858 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmlfq\" (UniqueName: \"kubernetes.io/projected/a589c8ef-17db-4df7-affb-8a40c753aaaa-kube-api-access-qmlfq\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.473867 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a589c8ef-17db-4df7-affb-8a40c753aaaa-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.662008 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-485wk"] Mar 18 15:41:52 crc kubenswrapper[4696]: I0318 15:41:52.667245 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-485wk"] Mar 18 15:41:53 crc kubenswrapper[4696]: I0318 15:41:53.603753 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a589c8ef-17db-4df7-affb-8a40c753aaaa" path="/var/lib/kubelet/pods/a589c8ef-17db-4df7-affb-8a40c753aaaa/volumes" Mar 18 15:41:53 crc kubenswrapper[4696]: I0318 15:41:53.604350 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" path="/var/lib/kubelet/pods/cfaa1769-2588-440e-9757-ded58dcb0ac3/volumes" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.955670 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm"] Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.955883 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.955894 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.955904 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerName="extract-content" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.955910 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerName="extract-content" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.955918 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.955924 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.955935 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerName="extract-utilities" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.955941 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerName="extract-utilities" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.955952 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerName="extract-utilities" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.955958 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerName="extract-utilities" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.955967 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.955973 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.955984 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="extract-utilities" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.955990 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="extract-utilities" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.955997 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerName="extract-content" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956002 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerName="extract-content" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.956009 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="extract-content" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956016 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="extract-content" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.956025 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956030 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.956037 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a589c8ef-17db-4df7-affb-8a40c753aaaa" containerName="oauth-openshift" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956044 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a589c8ef-17db-4df7-affb-8a40c753aaaa" containerName="oauth-openshift" Mar 18 15:41:54 crc kubenswrapper[4696]: E0318 15:41:54.956054 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" containerName="installer" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956061 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" containerName="installer" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956163 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0569eea-b948-4633-91d7-4ebfa02d5a8b" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956176 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ccac16-ae70-4ba3-9edf-76707d5643b1" containerName="installer" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956182 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfaa1769-2588-440e-9757-ded58dcb0ac3" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956191 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="459c4b74-f710-4b3a-b053-8b0326b87cb5" containerName="registry-server" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956200 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a589c8ef-17db-4df7-affb-8a40c753aaaa" containerName="oauth-openshift" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956208 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.956589 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.959075 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.959441 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.960039 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.960205 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.960053 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.960233 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.960231 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.961356 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.962815 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.963477 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 15:41:54 crc kubenswrapper[4696]: 
I0318 15:41:54.963703 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.964062 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.976051 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.979317 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.982558 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm"] Mar 18 15:41:54 crc kubenswrapper[4696]: I0318 15:41:54.996854 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103448 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103501 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " 
pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103552 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103600 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103641 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-session\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103682 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103712 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103745 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-audit-policies\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103766 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103817 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103879 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/78016513-52fe-4877-8e09-bcc184fd645d-audit-dir\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103907 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.103935 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqz78\" (UniqueName: \"kubernetes.io/projected/78016513-52fe-4877-8e09-bcc184fd645d-kube-api-access-sqz78\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.104055 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.204924 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " 
pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205037 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-audit-policies\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205062 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205093 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205120 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78016513-52fe-4877-8e09-bcc184fd645d-audit-dir\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205145 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205170 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqz78\" (UniqueName: \"kubernetes.io/projected/78016513-52fe-4877-8e09-bcc184fd645d-kube-api-access-sqz78\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205196 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205238 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205264 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " 
pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205285 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205308 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205330 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-session\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.205363 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.206047 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-audit-policies\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.206451 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78016513-52fe-4877-8e09-bcc184fd645d-audit-dir\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.206766 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.206976 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.207318 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc 
kubenswrapper[4696]: I0318 15:41:55.213054 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-session\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.213085 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.213118 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.213551 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.213844 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.213967 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.214024 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.215783 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78016513-52fe-4877-8e09-bcc184fd645d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.225685 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqz78\" (UniqueName: \"kubernetes.io/projected/78016513-52fe-4877-8e09-bcc184fd645d-kube-api-access-sqz78\") pod \"oauth-openshift-68b6cf6df7-vnhpm\" (UID: \"78016513-52fe-4877-8e09-bcc184fd645d\") " 
pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.272501 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:55 crc kubenswrapper[4696]: I0318 15:41:55.704335 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm"] Mar 18 15:41:56 crc kubenswrapper[4696]: I0318 15:41:56.353559 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" event={"ID":"78016513-52fe-4877-8e09-bcc184fd645d","Type":"ContainerStarted","Data":"50edce667da65abcba18132257d61463a8c00503333cd8ff36b82fb286ebdf62"} Mar 18 15:41:56 crc kubenswrapper[4696]: I0318 15:41:56.353908 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:41:56 crc kubenswrapper[4696]: I0318 15:41:56.353922 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" event={"ID":"78016513-52fe-4877-8e09-bcc184fd645d","Type":"ContainerStarted","Data":"18ca791858fc8e419a5f116e4cac2f777e7662072414c34f72fe40468aaa1413"} Mar 18 15:41:56 crc kubenswrapper[4696]: I0318 15:41:56.373457 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" podStartSLOduration=30.373437771 podStartE2EDuration="30.373437771s" podCreationTimestamp="2026-03-18 15:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:56.370837593 +0000 UTC m=+359.377011819" watchObservedRunningTime="2026-03-18 15:41:56.373437771 +0000 UTC m=+359.379611977" Mar 18 15:41:56 crc kubenswrapper[4696]: I0318 15:41:56.832745 4696 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68b6cf6df7-vnhpm" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.162059 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564142-p5rzg"] Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.164007 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-p5rzg" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.166754 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.167206 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.167821 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.181936 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-p5rzg"] Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.268086 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgh6\" (UniqueName: \"kubernetes.io/projected/9646518f-1f05-4cb7-a921-5cd0e7af7b4b-kube-api-access-8pgh6\") pod \"auto-csr-approver-29564142-p5rzg\" (UID: \"9646518f-1f05-4cb7-a921-5cd0e7af7b4b\") " pod="openshift-infra/auto-csr-approver-29564142-p5rzg" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.369465 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgh6\" (UniqueName: \"kubernetes.io/projected/9646518f-1f05-4cb7-a921-5cd0e7af7b4b-kube-api-access-8pgh6\") pod \"auto-csr-approver-29564142-p5rzg\" (UID: \"9646518f-1f05-4cb7-a921-5cd0e7af7b4b\") " 
pod="openshift-infra/auto-csr-approver-29564142-p5rzg" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.396916 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgh6\" (UniqueName: \"kubernetes.io/projected/9646518f-1f05-4cb7-a921-5cd0e7af7b4b-kube-api-access-8pgh6\") pod \"auto-csr-approver-29564142-p5rzg\" (UID: \"9646518f-1f05-4cb7-a921-5cd0e7af7b4b\") " pod="openshift-infra/auto-csr-approver-29564142-p5rzg" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.485950 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-p5rzg" Mar 18 15:42:00 crc kubenswrapper[4696]: I0318 15:42:00.899553 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-p5rzg"] Mar 18 15:42:01 crc kubenswrapper[4696]: I0318 15:42:01.379768 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-p5rzg" event={"ID":"9646518f-1f05-4cb7-a921-5cd0e7af7b4b","Type":"ContainerStarted","Data":"dedea5b9ad3a849dbc33c35dde1df0eb85c1ee65bed58a730971e982e7ce77cf"} Mar 18 15:42:03 crc kubenswrapper[4696]: I0318 15:42:03.394785 4696 generic.go:334] "Generic (PLEG): container finished" podID="9646518f-1f05-4cb7-a921-5cd0e7af7b4b" containerID="5ffd7de8b7b7c0823b55a77f47c2d71a383752769df88c54e14f7a9eb06943c0" exitCode=0 Mar 18 15:42:03 crc kubenswrapper[4696]: I0318 15:42:03.394893 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-p5rzg" event={"ID":"9646518f-1f05-4cb7-a921-5cd0e7af7b4b","Type":"ContainerDied","Data":"5ffd7de8b7b7c0823b55a77f47c2d71a383752769df88c54e14f7a9eb06943c0"} Mar 18 15:42:04 crc kubenswrapper[4696]: I0318 15:42:04.654092 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-p5rzg" Mar 18 15:42:04 crc kubenswrapper[4696]: I0318 15:42:04.821464 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pgh6\" (UniqueName: \"kubernetes.io/projected/9646518f-1f05-4cb7-a921-5cd0e7af7b4b-kube-api-access-8pgh6\") pod \"9646518f-1f05-4cb7-a921-5cd0e7af7b4b\" (UID: \"9646518f-1f05-4cb7-a921-5cd0e7af7b4b\") " Mar 18 15:42:04 crc kubenswrapper[4696]: I0318 15:42:04.830265 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9646518f-1f05-4cb7-a921-5cd0e7af7b4b-kube-api-access-8pgh6" (OuterVolumeSpecName: "kube-api-access-8pgh6") pod "9646518f-1f05-4cb7-a921-5cd0e7af7b4b" (UID: "9646518f-1f05-4cb7-a921-5cd0e7af7b4b"). InnerVolumeSpecName "kube-api-access-8pgh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:04 crc kubenswrapper[4696]: I0318 15:42:04.923340 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pgh6\" (UniqueName: \"kubernetes.io/projected/9646518f-1f05-4cb7-a921-5cd0e7af7b4b-kube-api-access-8pgh6\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:05 crc kubenswrapper[4696]: I0318 15:42:05.407284 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-p5rzg" event={"ID":"9646518f-1f05-4cb7-a921-5cd0e7af7b4b","Type":"ContainerDied","Data":"dedea5b9ad3a849dbc33c35dde1df0eb85c1ee65bed58a730971e982e7ce77cf"} Mar 18 15:42:05 crc kubenswrapper[4696]: I0318 15:42:05.407340 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dedea5b9ad3a849dbc33c35dde1df0eb85c1ee65bed58a730971e982e7ce77cf" Mar 18 15:42:05 crc kubenswrapper[4696]: I0318 15:42:05.407373 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-p5rzg" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.869875 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wgfqb"] Mar 18 15:42:25 crc kubenswrapper[4696]: E0318 15:42:25.870476 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9646518f-1f05-4cb7-a921-5cd0e7af7b4b" containerName="oc" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.870488 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9646518f-1f05-4cb7-a921-5cd0e7af7b4b" containerName="oc" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.870592 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9646518f-1f05-4cb7-a921-5cd0e7af7b4b" containerName="oc" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.870946 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.894825 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wgfqb"] Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.924375 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b42c927-6912-4be3-a40e-e52643ec9137-registry-certificates\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.924451 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlh4s\" (UniqueName: \"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-kube-api-access-nlh4s\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: 
\"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.924492 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.924549 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-bound-sa-token\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.924574 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-registry-tls\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.924598 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b42c927-6912-4be3-a40e-e52643ec9137-trusted-ca\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.924642 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b42c927-6912-4be3-a40e-e52643ec9137-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.924845 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b42c927-6912-4be3-a40e-e52643ec9137-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:25 crc kubenswrapper[4696]: I0318 15:42:25.944338 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.026502 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b42c927-6912-4be3-a40e-e52643ec9137-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.026594 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b42c927-6912-4be3-a40e-e52643ec9137-registry-certificates\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.026649 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlh4s\" (UniqueName: \"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-kube-api-access-nlh4s\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.026714 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-bound-sa-token\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.026743 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-registry-tls\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.027261 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b42c927-6912-4be3-a40e-e52643ec9137-trusted-ca\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.027325 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b42c927-6912-4be3-a40e-e52643ec9137-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.027176 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b42c927-6912-4be3-a40e-e52643ec9137-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.028144 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b42c927-6912-4be3-a40e-e52643ec9137-registry-certificates\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.029439 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b42c927-6912-4be3-a40e-e52643ec9137-trusted-ca\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.033001 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b42c927-6912-4be3-a40e-e52643ec9137-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.033240 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-registry-tls\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.047401 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlh4s\" (UniqueName: \"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-kube-api-access-nlh4s\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.049868 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b42c927-6912-4be3-a40e-e52643ec9137-bound-sa-token\") pod \"image-registry-66df7c8f76-wgfqb\" (UID: \"1b42c927-6912-4be3-a40e-e52643ec9137\") " pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.189435 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.397562 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wgfqb"] Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.540551 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" event={"ID":"1b42c927-6912-4be3-a40e-e52643ec9137","Type":"ContainerStarted","Data":"f56afe8280e48bb87f5371943f79ad05fcca9c608347ddf683b7341f63250afd"} Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.541015 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.541028 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" event={"ID":"1b42c927-6912-4be3-a40e-e52643ec9137","Type":"ContainerStarted","Data":"33e0dfa54176791b15f3b845010262ec8ab171acd241d180bf9746dee139b8e9"} Mar 18 15:42:26 crc kubenswrapper[4696]: I0318 15:42:26.561221 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" podStartSLOduration=1.561201723 podStartE2EDuration="1.561201723s" podCreationTimestamp="2026-03-18 15:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:42:26.559850368 +0000 UTC m=+389.566024574" watchObservedRunningTime="2026-03-18 15:42:26.561201723 +0000 UTC m=+389.567375929" Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.875273 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tkdr"] Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.875880 4696 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-8tkdr" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="registry-server" containerID="cri-o://7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79" gracePeriod=30 Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.879360 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xd5v7"] Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.879655 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xd5v7" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="registry-server" containerID="cri-o://d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b" gracePeriod=30 Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.883192 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4wjj"] Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.883410 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" podUID="e4224061-726c-4bee-84d0-9b1dfddcdaa4" containerName="marketplace-operator" containerID="cri-o://ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5" gracePeriod=30 Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.888478 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsq74"] Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.888721 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nsq74" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerName="registry-server" containerID="cri-o://4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2" gracePeriod=30 Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.903382 4696 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6p846"] Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.903787 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6p846" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" containerName="registry-server" containerID="cri-o://944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28" gracePeriod=30 Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.910659 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v84rb"] Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.911456 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.925343 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v84rb"] Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.963786 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71a9fceb-5471-42f1-867d-28f7196daf81-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 15:42:39.964096 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/71a9fceb-5471-42f1-867d-28f7196daf81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:39 crc kubenswrapper[4696]: I0318 
15:42:39.964127 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgndn\" (UniqueName: \"kubernetes.io/projected/71a9fceb-5471-42f1-867d-28f7196daf81-kube-api-access-cgndn\") pod \"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.065557 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/71a9fceb-5471-42f1-867d-28f7196daf81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.065932 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgndn\" (UniqueName: \"kubernetes.io/projected/71a9fceb-5471-42f1-867d-28f7196daf81-kube-api-access-cgndn\") pod \"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.066003 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71a9fceb-5471-42f1-867d-28f7196daf81-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.067326 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71a9fceb-5471-42f1-867d-28f7196daf81-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.072232 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/71a9fceb-5471-42f1-867d-28f7196daf81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.088079 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgndn\" (UniqueName: \"kubernetes.io/projected/71a9fceb-5471-42f1-867d-28f7196daf81-kube-api-access-cgndn\") pod \"marketplace-operator-79b997595-v84rb\" (UID: \"71a9fceb-5471-42f1-867d-28f7196daf81\") " pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.232150 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.351177 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.354285 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.372385 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-trusted-ca\") pod \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.372557 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-catalog-content\") pod \"4e22cc1d-032f-4f3a-a0ca-51708beef610\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.372597 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-operator-metrics\") pod \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.372640 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-utilities\") pod \"4e22cc1d-032f-4f3a-a0ca-51708beef610\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.372809 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2bmg\" (UniqueName: \"kubernetes.io/projected/e4224061-726c-4bee-84d0-9b1dfddcdaa4-kube-api-access-f2bmg\") pod \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\" (UID: \"e4224061-726c-4bee-84d0-9b1dfddcdaa4\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.378377 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2tz9b\" (UniqueName: \"kubernetes.io/projected/4e22cc1d-032f-4f3a-a0ca-51708beef610-kube-api-access-2tz9b\") pod \"4e22cc1d-032f-4f3a-a0ca-51708beef610\" (UID: \"4e22cc1d-032f-4f3a-a0ca-51708beef610\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.378994 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-utilities" (OuterVolumeSpecName: "utilities") pod "4e22cc1d-032f-4f3a-a0ca-51708beef610" (UID: "4e22cc1d-032f-4f3a-a0ca-51708beef610"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.379657 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e4224061-726c-4bee-84d0-9b1dfddcdaa4" (UID: "e4224061-726c-4bee-84d0-9b1dfddcdaa4"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.387204 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4224061-726c-4bee-84d0-9b1dfddcdaa4-kube-api-access-f2bmg" (OuterVolumeSpecName: "kube-api-access-f2bmg") pod "e4224061-726c-4bee-84d0-9b1dfddcdaa4" (UID: "e4224061-726c-4bee-84d0-9b1dfddcdaa4"). InnerVolumeSpecName "kube-api-access-f2bmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.387399 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e22cc1d-032f-4f3a-a0ca-51708beef610-kube-api-access-2tz9b" (OuterVolumeSpecName: "kube-api-access-2tz9b") pod "4e22cc1d-032f-4f3a-a0ca-51708beef610" (UID: "4e22cc1d-032f-4f3a-a0ca-51708beef610"). 
InnerVolumeSpecName "kube-api-access-2tz9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.389276 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e4224061-726c-4bee-84d0-9b1dfddcdaa4" (UID: "e4224061-726c-4bee-84d0-9b1dfddcdaa4"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.393138 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.420012 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.428541 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.468267 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e22cc1d-032f-4f3a-a0ca-51708beef610" (UID: "4e22cc1d-032f-4f3a-a0ca-51708beef610"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480188 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-catalog-content\") pod \"80160993-15c5-4eea-ac72-2094fe935ac1\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480243 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-catalog-content\") pod \"aa06cecb-1f9f-431d-933f-0e87033cd695\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480286 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-utilities\") pod \"80160993-15c5-4eea-ac72-2094fe935ac1\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480355 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fczjg\" (UniqueName: \"kubernetes.io/projected/80160993-15c5-4eea-ac72-2094fe935ac1-kube-api-access-fczjg\") pod \"80160993-15c5-4eea-ac72-2094fe935ac1\" (UID: \"80160993-15c5-4eea-ac72-2094fe935ac1\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480380 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27rp6\" (UniqueName: \"kubernetes.io/projected/aa06cecb-1f9f-431d-933f-0e87033cd695-kube-api-access-27rp6\") pod \"aa06cecb-1f9f-431d-933f-0e87033cd695\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480422 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-utilities\") pod \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480459 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87qhb\" (UniqueName: \"kubernetes.io/projected/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-kube-api-access-87qhb\") pod \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480478 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-catalog-content\") pod \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\" (UID: \"c9ddf1c7-0e1d-4f54-a93d-1665148569b2\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480509 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-utilities\") pod \"aa06cecb-1f9f-431d-933f-0e87033cd695\" (UID: \"aa06cecb-1f9f-431d-933f-0e87033cd695\") " Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480809 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480832 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2bmg\" (UniqueName: \"kubernetes.io/projected/e4224061-726c-4bee-84d0-9b1dfddcdaa4-kube-api-access-f2bmg\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480845 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tz9b\" (UniqueName: 
\"kubernetes.io/projected/4e22cc1d-032f-4f3a-a0ca-51708beef610-kube-api-access-2tz9b\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480855 4696 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480868 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e22cc1d-032f-4f3a-a0ca-51708beef610-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.480878 4696 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e4224061-726c-4bee-84d0-9b1dfddcdaa4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.481440 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-utilities" (OuterVolumeSpecName: "utilities") pod "80160993-15c5-4eea-ac72-2094fe935ac1" (UID: "80160993-15c5-4eea-ac72-2094fe935ac1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.481573 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-utilities" (OuterVolumeSpecName: "utilities") pod "aa06cecb-1f9f-431d-933f-0e87033cd695" (UID: "aa06cecb-1f9f-431d-933f-0e87033cd695"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.482452 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-utilities" (OuterVolumeSpecName: "utilities") pod "c9ddf1c7-0e1d-4f54-a93d-1665148569b2" (UID: "c9ddf1c7-0e1d-4f54-a93d-1665148569b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.484675 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-kube-api-access-87qhb" (OuterVolumeSpecName: "kube-api-access-87qhb") pod "c9ddf1c7-0e1d-4f54-a93d-1665148569b2" (UID: "c9ddf1c7-0e1d-4f54-a93d-1665148569b2"). InnerVolumeSpecName "kube-api-access-87qhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.486051 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80160993-15c5-4eea-ac72-2094fe935ac1-kube-api-access-fczjg" (OuterVolumeSpecName: "kube-api-access-fczjg") pod "80160993-15c5-4eea-ac72-2094fe935ac1" (UID: "80160993-15c5-4eea-ac72-2094fe935ac1"). InnerVolumeSpecName "kube-api-access-fczjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.488354 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa06cecb-1f9f-431d-933f-0e87033cd695-kube-api-access-27rp6" (OuterVolumeSpecName: "kube-api-access-27rp6") pod "aa06cecb-1f9f-431d-933f-0e87033cd695" (UID: "aa06cecb-1f9f-431d-933f-0e87033cd695"). InnerVolumeSpecName "kube-api-access-27rp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.517049 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9ddf1c7-0e1d-4f54-a93d-1665148569b2" (UID: "c9ddf1c7-0e1d-4f54-a93d-1665148569b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.531810 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa06cecb-1f9f-431d-933f-0e87033cd695" (UID: "aa06cecb-1f9f-431d-933f-0e87033cd695"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.582011 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.582061 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.582083 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fczjg\" (UniqueName: \"kubernetes.io/projected/80160993-15c5-4eea-ac72-2094fe935ac1-kube-api-access-fczjg\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.582108 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27rp6\" (UniqueName: 
\"kubernetes.io/projected/aa06cecb-1f9f-431d-933f-0e87033cd695-kube-api-access-27rp6\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.582128 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.582147 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87qhb\" (UniqueName: \"kubernetes.io/projected/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-kube-api-access-87qhb\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.582170 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ddf1c7-0e1d-4f54-a93d-1665148569b2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.582192 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa06cecb-1f9f-431d-933f-0e87033cd695-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.624173 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80160993-15c5-4eea-ac72-2094fe935ac1" (UID: "80160993-15c5-4eea-ac72-2094fe935ac1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.631364 4696 generic.go:334] "Generic (PLEG): container finished" podID="e4224061-726c-4bee-84d0-9b1dfddcdaa4" containerID="ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5" exitCode=0 Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.631443 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" event={"ID":"e4224061-726c-4bee-84d0-9b1dfddcdaa4","Type":"ContainerDied","Data":"ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.631479 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" event={"ID":"e4224061-726c-4bee-84d0-9b1dfddcdaa4","Type":"ContainerDied","Data":"2f515891383010cd584ca85dec8eef1b0cf4225c35ee686fb45cb40f7bd222da"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.631504 4696 scope.go:117] "RemoveContainer" containerID="ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.631672 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r4wjj" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.635783 4696 generic.go:334] "Generic (PLEG): container finished" podID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerID="d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b" exitCode=0 Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.635975 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xd5v7" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.636040 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xd5v7" event={"ID":"aa06cecb-1f9f-431d-933f-0e87033cd695","Type":"ContainerDied","Data":"d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.636199 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xd5v7" event={"ID":"aa06cecb-1f9f-431d-933f-0e87033cd695","Type":"ContainerDied","Data":"b65cf999813e5f1c1926a5cafce52cef7f2938456f73d425a0af85b4b47e91da"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.640955 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerID="7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79" exitCode=0 Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.641056 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tkdr" event={"ID":"4e22cc1d-032f-4f3a-a0ca-51708beef610","Type":"ContainerDied","Data":"7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.641090 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tkdr" event={"ID":"4e22cc1d-032f-4f3a-a0ca-51708beef610","Type":"ContainerDied","Data":"f86dd1e0d8c1a177336ea59b01846ab0224fde14020f614154829cef7ae66e49"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.641166 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tkdr" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.643252 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerID="4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2" exitCode=0 Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.643309 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq74" event={"ID":"c9ddf1c7-0e1d-4f54-a93d-1665148569b2","Type":"ContainerDied","Data":"4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.643331 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq74" event={"ID":"c9ddf1c7-0e1d-4f54-a93d-1665148569b2","Type":"ContainerDied","Data":"5485c9cccc41913c6de6b33f3ff60bb9a62381c9b21bd31f04b4c0cd70f89dd3"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.643385 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsq74" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.649577 4696 generic.go:334] "Generic (PLEG): container finished" podID="80160993-15c5-4eea-ac72-2094fe935ac1" containerID="944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28" exitCode=0 Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.649649 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p846" event={"ID":"80160993-15c5-4eea-ac72-2094fe935ac1","Type":"ContainerDied","Data":"944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.649675 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p846" event={"ID":"80160993-15c5-4eea-ac72-2094fe935ac1","Type":"ContainerDied","Data":"0de4735b60cbe9c198533dd81efe1303a5220c5d6109bfd2e986ca272ad80608"} Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.649735 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6p846" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.659620 4696 scope.go:117] "RemoveContainer" containerID="ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.660203 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5\": container with ID starting with ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5 not found: ID does not exist" containerID="ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.660241 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5"} err="failed to get container status \"ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5\": rpc error: code = NotFound desc = could not find container \"ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5\": container with ID starting with ea5930774ab4c5fe5ce9ef076acc7e45c37308db4c11faee50be36a955bf26d5 not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.660266 4696 scope.go:117] "RemoveContainer" containerID="d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.673273 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4wjj"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.679316 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4wjj"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.683953 4696 scope.go:117] "RemoveContainer" 
containerID="00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.685103 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80160993-15c5-4eea-ac72-2094fe935ac1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.688655 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xd5v7"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.696196 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xd5v7"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.713847 4696 scope.go:117] "RemoveContainer" containerID="5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.723977 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tkdr"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.730530 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8tkdr"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.735610 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6p846"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.737700 4696 scope.go:117] "RemoveContainer" containerID="d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.738023 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b\": container with ID starting with d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b not found: ID does not exist" 
containerID="d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.738049 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b"} err="failed to get container status \"d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b\": rpc error: code = NotFound desc = could not find container \"d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b\": container with ID starting with d5fae5a1b8c50b5b00df7d9c7769430f8c57dcdd7b6c1a1483f723a168ab920b not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.738069 4696 scope.go:117] "RemoveContainer" containerID="00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.738481 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b\": container with ID starting with 00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b not found: ID does not exist" containerID="00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.738531 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b"} err="failed to get container status \"00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b\": rpc error: code = NotFound desc = could not find container \"00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b\": container with ID starting with 00957881a8f41b1f0680c35ee86ecd7b138e47286b1503e74470035ed82a453b not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.738557 4696 scope.go:117] 
"RemoveContainer" containerID="5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.738752 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v84rb"] Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.738917 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88\": container with ID starting with 5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88 not found: ID does not exist" containerID="5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.738952 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88"} err="failed to get container status \"5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88\": rpc error: code = NotFound desc = could not find container \"5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88\": container with ID starting with 5f6e0f6fb7a3ef6587d5be1c51cf2de5b0cf603a41ad030efa6f04c645f0bd88 not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.738988 4696 scope.go:117] "RemoveContainer" containerID="7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.743788 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6p846"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.747968 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsq74"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.751257 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-nsq74"] Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.757393 4696 scope.go:117] "RemoveContainer" containerID="b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.772230 4696 scope.go:117] "RemoveContainer" containerID="5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.792847 4696 scope.go:117] "RemoveContainer" containerID="7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.793239 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79\": container with ID starting with 7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79 not found: ID does not exist" containerID="7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.793283 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79"} err="failed to get container status \"7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79\": rpc error: code = NotFound desc = could not find container \"7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79\": container with ID starting with 7a7de1246e6e5bff29d1523d9413ddf3c1548e54f064c428f3f6b49119a36f79 not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.793305 4696 scope.go:117] "RemoveContainer" containerID="b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.793512 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1\": container with ID starting with b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1 not found: ID does not exist" containerID="b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.793698 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1"} err="failed to get container status \"b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1\": rpc error: code = NotFound desc = could not find container \"b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1\": container with ID starting with b31f83c6879b07215a158be92ed20303fe502e50aa0d6b9736f1bd1a50acfba1 not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.793713 4696 scope.go:117] "RemoveContainer" containerID="5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.793890 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e\": container with ID starting with 5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e not found: ID does not exist" containerID="5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.793909 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e"} err="failed to get container status \"5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e\": rpc error: code = NotFound desc = could not find container \"5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e\": 
container with ID starting with 5280f1f44dac2775438c9fee36b10e0cead41a7c2011abb566b45d899281c72e not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.793934 4696 scope.go:117] "RemoveContainer" containerID="4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.814210 4696 scope.go:117] "RemoveContainer" containerID="c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.827367 4696 scope.go:117] "RemoveContainer" containerID="a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.842100 4696 scope.go:117] "RemoveContainer" containerID="4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.842506 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2\": container with ID starting with 4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2 not found: ID does not exist" containerID="4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.842551 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2"} err="failed to get container status \"4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2\": rpc error: code = NotFound desc = could not find container \"4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2\": container with ID starting with 4ac84bc9741d1fba5138ba79059a6ed4b65e1c460669ecbf4510c7969d5913a2 not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.842579 4696 scope.go:117] 
"RemoveContainer" containerID="c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.842920 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74\": container with ID starting with c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74 not found: ID does not exist" containerID="c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.842971 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74"} err="failed to get container status \"c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74\": rpc error: code = NotFound desc = could not find container \"c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74\": container with ID starting with c895ffd8b16678019de16ad285ff0f2b53dcfb5c7d972576c54fa03c72e9ff74 not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.843000 4696 scope.go:117] "RemoveContainer" containerID="a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.843745 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04\": container with ID starting with a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04 not found: ID does not exist" containerID="a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.843772 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04"} err="failed to get container status \"a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04\": rpc error: code = NotFound desc = could not find container \"a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04\": container with ID starting with a2f3b78ebaee0a446e890da4ddbd789d3b459832e681023e4703ef94b863df04 not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.843790 4696 scope.go:117] "RemoveContainer" containerID="944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.860859 4696 scope.go:117] "RemoveContainer" containerID="251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.885860 4696 scope.go:117] "RemoveContainer" containerID="6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.901379 4696 scope.go:117] "RemoveContainer" containerID="944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.901808 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28\": container with ID starting with 944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28 not found: ID does not exist" containerID="944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.901931 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28"} err="failed to get container status \"944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28\": rpc error: code = 
NotFound desc = could not find container \"944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28\": container with ID starting with 944127e4043f6ef58f8f14c7dbfda1111f31e8774ec4c3ade3f8cd7edf82fb28 not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.902058 4696 scope.go:117] "RemoveContainer" containerID="251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.902407 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce\": container with ID starting with 251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce not found: ID does not exist" containerID="251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.902615 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce"} err="failed to get container status \"251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce\": rpc error: code = NotFound desc = could not find container \"251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce\": container with ID starting with 251fdb2378bb64a75ba20694ca9848b8c1927d4a67ff2b1b35dd5b6adb7c60ce not found: ID does not exist" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.902724 4696 scope.go:117] "RemoveContainer" containerID="6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c" Mar 18 15:42:40 crc kubenswrapper[4696]: E0318 15:42:40.903099 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c\": container with ID starting with 
6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c not found: ID does not exist" containerID="6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c" Mar 18 15:42:40 crc kubenswrapper[4696]: I0318 15:42:40.903118 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c"} err="failed to get container status \"6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c\": rpc error: code = NotFound desc = could not find container \"6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c\": container with ID starting with 6c9b16f0e7f7a5bd4a2d449b0acec80c018be612f776daab6ed9ac580b2b339c not found: ID does not exist" Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.605420 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" path="/var/lib/kubelet/pods/4e22cc1d-032f-4f3a-a0ca-51708beef610/volumes" Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.606739 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" path="/var/lib/kubelet/pods/80160993-15c5-4eea-ac72-2094fe935ac1/volumes" Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.607311 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" path="/var/lib/kubelet/pods/aa06cecb-1f9f-431d-933f-0e87033cd695/volumes" Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.608662 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" path="/var/lib/kubelet/pods/c9ddf1c7-0e1d-4f54-a93d-1665148569b2/volumes" Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.609444 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4224061-726c-4bee-84d0-9b1dfddcdaa4" 
path="/var/lib/kubelet/pods/e4224061-726c-4bee-84d0-9b1dfddcdaa4/volumes" Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.656754 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" event={"ID":"71a9fceb-5471-42f1-867d-28f7196daf81","Type":"ContainerStarted","Data":"36ecd14eb917a8b94c1e74114efd77e95cceceb7804ec35c2d64c248bdd173fe"} Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.656815 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" event={"ID":"71a9fceb-5471-42f1-867d-28f7196daf81","Type":"ContainerStarted","Data":"3290ad072c54caba01f1782cf50830735e4ec65c2b254eea3b8410fc706f7bae"} Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.656985 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.663895 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" Mar 18 15:42:41 crc kubenswrapper[4696]: I0318 15:42:41.679367 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-v84rb" podStartSLOduration=2.679345367 podStartE2EDuration="2.679345367s" podCreationTimestamp="2026-03-18 15:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:42:41.67321308 +0000 UTC m=+404.679387286" watchObservedRunningTime="2026-03-18 15:42:41.679345367 +0000 UTC m=+404.685519573" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.079359 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp7"] Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080039 4696 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="extract-content" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080055 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="extract-content" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080064 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerName="extract-utilities" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080071 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerName="extract-utilities" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080081 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="extract-utilities" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080088 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="extract-utilities" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080095 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="extract-content" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080101 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="extract-content" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080107 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080112 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080122 4696 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080129 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080140 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" containerName="extract-content" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080146 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" containerName="extract-content" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080154 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="extract-utilities" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080159 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="extract-utilities" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080165 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" containerName="extract-utilities" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080170 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" containerName="extract-utilities" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080178 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080183 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080191 4696 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080431 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080458 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4224061-726c-4bee-84d0-9b1dfddcdaa4" containerName="marketplace-operator" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080465 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4224061-726c-4bee-84d0-9b1dfddcdaa4" containerName="marketplace-operator" Mar 18 15:42:42 crc kubenswrapper[4696]: E0318 15:42:42.080473 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerName="extract-content" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080478 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerName="extract-content" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080669 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ddf1c7-0e1d-4f54-a93d-1665148569b2" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080680 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4224061-726c-4bee-84d0-9b1dfddcdaa4" containerName="marketplace-operator" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080699 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="80160993-15c5-4eea-ac72-2094fe935ac1" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080793 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e22cc1d-032f-4f3a-a0ca-51708beef610" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.080803 4696 
memory_manager.go:354] "RemoveStaleState removing state" podUID="aa06cecb-1f9f-431d-933f-0e87033cd695" containerName="registry-server" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.082915 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.085347 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.087064 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp7"] Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.134311 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4c082b9-ab53-4ff0-94db-c714a1fc683e-catalog-content\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.134408 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4c082b9-ab53-4ff0-94db-c714a1fc683e-utilities\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.134461 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg8jb\" (UniqueName: \"kubernetes.io/projected/a4c082b9-ab53-4ff0-94db-c714a1fc683e-kube-api-access-pg8jb\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.184597 4696 
patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.184666 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.235848 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4c082b9-ab53-4ff0-94db-c714a1fc683e-utilities\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.235916 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg8jb\" (UniqueName: \"kubernetes.io/projected/a4c082b9-ab53-4ff0-94db-c714a1fc683e-kube-api-access-pg8jb\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.235976 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4c082b9-ab53-4ff0-94db-c714a1fc683e-catalog-content\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.236260 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4c082b9-ab53-4ff0-94db-c714a1fc683e-utilities\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.236366 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4c082b9-ab53-4ff0-94db-c714a1fc683e-catalog-content\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.254625 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg8jb\" (UniqueName: \"kubernetes.io/projected/a4c082b9-ab53-4ff0-94db-c714a1fc683e-kube-api-access-pg8jb\") pod \"redhat-marketplace-g8lp7\" (UID: \"a4c082b9-ab53-4ff0-94db-c714a1fc683e\") " pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.272902 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gxwqj"] Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.274111 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.276029 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.282695 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxwqj"] Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.337182 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-utilities\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.337244 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xmp2\" (UniqueName: \"kubernetes.io/projected/cee4651c-7e34-42f9-bb81-9537803fa622-kube-api-access-6xmp2\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.337270 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-catalog-content\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.438550 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-utilities\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " 
pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.438626 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xmp2\" (UniqueName: \"kubernetes.io/projected/cee4651c-7e34-42f9-bb81-9537803fa622-kube-api-access-6xmp2\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.438652 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-catalog-content\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.439099 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-utilities\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.439099 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-catalog-content\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.449671 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.459876 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xmp2\" (UniqueName: \"kubernetes.io/projected/cee4651c-7e34-42f9-bb81-9537803fa622-kube-api-access-6xmp2\") pod \"redhat-operators-gxwqj\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") " pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.591225 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.655737 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp7"] Mar 18 15:42:42 crc kubenswrapper[4696]: W0318 15:42:42.668306 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c082b9_ab53_4ff0_94db_c714a1fc683e.slice/crio-223740e46a0009556fe72e65d2fd8046213933437480187eff37f8feb3b0c48a WatchSource:0}: Error finding container 223740e46a0009556fe72e65d2fd8046213933437480187eff37f8feb3b0c48a: Status 404 returned error can't find the container with id 223740e46a0009556fe72e65d2fd8046213933437480187eff37f8feb3b0c48a Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.699608 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp7" event={"ID":"a4c082b9-ab53-4ff0-94db-c714a1fc683e","Type":"ContainerStarted","Data":"223740e46a0009556fe72e65d2fd8046213933437480187eff37f8feb3b0c48a"} Mar 18 15:42:42 crc kubenswrapper[4696]: I0318 15:42:42.789068 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxwqj"] Mar 18 15:42:42 crc kubenswrapper[4696]: W0318 15:42:42.795142 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee4651c_7e34_42f9_bb81_9537803fa622.slice/crio-c84647d097619702e7e00b1e786d0990520c302cf29aac31dee929e75b548639 WatchSource:0}: Error finding container c84647d097619702e7e00b1e786d0990520c302cf29aac31dee929e75b548639: Status 404 returned error can't find the container with id c84647d097619702e7e00b1e786d0990520c302cf29aac31dee929e75b548639 Mar 18 15:42:43 crc kubenswrapper[4696]: I0318 15:42:43.704450 4696 generic.go:334] "Generic (PLEG): container finished" podID="a4c082b9-ab53-4ff0-94db-c714a1fc683e" containerID="55289e11025acb48a6563db375f36c8dd58c94ecc83c4959dd512985bfc5178b" exitCode=0 Mar 18 15:42:43 crc kubenswrapper[4696]: I0318 15:42:43.704549 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp7" event={"ID":"a4c082b9-ab53-4ff0-94db-c714a1fc683e","Type":"ContainerDied","Data":"55289e11025acb48a6563db375f36c8dd58c94ecc83c4959dd512985bfc5178b"} Mar 18 15:42:43 crc kubenswrapper[4696]: I0318 15:42:43.706500 4696 generic.go:334] "Generic (PLEG): container finished" podID="cee4651c-7e34-42f9-bb81-9537803fa622" containerID="7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25" exitCode=0 Mar 18 15:42:43 crc kubenswrapper[4696]: I0318 15:42:43.706641 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxwqj" event={"ID":"cee4651c-7e34-42f9-bb81-9537803fa622","Type":"ContainerDied","Data":"7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25"} Mar 18 15:42:43 crc kubenswrapper[4696]: I0318 15:42:43.706718 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxwqj" event={"ID":"cee4651c-7e34-42f9-bb81-9537803fa622","Type":"ContainerStarted","Data":"c84647d097619702e7e00b1e786d0990520c302cf29aac31dee929e75b548639"} Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.474743 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-dbfmt"] Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.475783 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.478027 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.489748 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbfmt"] Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.577101 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c88d01e2-45ab-4111-9029-3f3e2c12c58e-utilities\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.577175 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c88d01e2-45ab-4111-9029-3f3e2c12c58e-catalog-content\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.577200 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jzc\" (UniqueName: \"kubernetes.io/projected/c88d01e2-45ab-4111-9029-3f3e2c12c58e-kube-api-access-87jzc\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.677672 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-mvwp5"] Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.678247 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c88d01e2-45ab-4111-9029-3f3e2c12c58e-catalog-content\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.678276 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87jzc\" (UniqueName: \"kubernetes.io/projected/c88d01e2-45ab-4111-9029-3f3e2c12c58e-kube-api-access-87jzc\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.678387 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c88d01e2-45ab-4111-9029-3f3e2c12c58e-utilities\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.679322 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c88d01e2-45ab-4111-9029-3f3e2c12c58e-catalog-content\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.680412 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.683922 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c88d01e2-45ab-4111-9029-3f3e2c12c58e-utilities\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.686974 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.690832 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvwp5"] Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.701414 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jzc\" (UniqueName: \"kubernetes.io/projected/c88d01e2-45ab-4111-9029-3f3e2c12c58e-kube-api-access-87jzc\") pod \"community-operators-dbfmt\" (UID: \"c88d01e2-45ab-4111-9029-3f3e2c12c58e\") " pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.780772 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f11748-034a-4f55-9da4-ee34ca565a33-catalog-content\") pod \"certified-operators-mvwp5\" (UID: \"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.781660 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98nl8\" (UniqueName: \"kubernetes.io/projected/c0f11748-034a-4f55-9da4-ee34ca565a33-kube-api-access-98nl8\") pod \"certified-operators-mvwp5\" (UID: 
\"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.781770 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f11748-034a-4f55-9da4-ee34ca565a33-utilities\") pod \"certified-operators-mvwp5\" (UID: \"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.823393 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.883011 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f11748-034a-4f55-9da4-ee34ca565a33-catalog-content\") pod \"certified-operators-mvwp5\" (UID: \"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.883070 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98nl8\" (UniqueName: \"kubernetes.io/projected/c0f11748-034a-4f55-9da4-ee34ca565a33-kube-api-access-98nl8\") pod \"certified-operators-mvwp5\" (UID: \"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.883091 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f11748-034a-4f55-9da4-ee34ca565a33-utilities\") pod \"certified-operators-mvwp5\" (UID: \"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.883755 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f11748-034a-4f55-9da4-ee34ca565a33-utilities\") pod \"certified-operators-mvwp5\" (UID: \"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.883764 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f11748-034a-4f55-9da4-ee34ca565a33-catalog-content\") pod \"certified-operators-mvwp5\" (UID: \"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:44 crc kubenswrapper[4696]: I0318 15:42:44.900779 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98nl8\" (UniqueName: \"kubernetes.io/projected/c0f11748-034a-4f55-9da4-ee34ca565a33-kube-api-access-98nl8\") pod \"certified-operators-mvwp5\" (UID: \"c0f11748-034a-4f55-9da4-ee34ca565a33\") " pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.022601 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.031912 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbfmt"] Mar 18 15:42:45 crc kubenswrapper[4696]: W0318 15:42:45.080479 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc88d01e2_45ab_4111_9029_3f3e2c12c58e.slice/crio-8c8b49bad3efa801a75ca2f7c5c1721617b73250cb6413c5be8f3dc1e73bb21e WatchSource:0}: Error finding container 8c8b49bad3efa801a75ca2f7c5c1721617b73250cb6413c5be8f3dc1e73bb21e: Status 404 returned error can't find the container with id 8c8b49bad3efa801a75ca2f7c5c1721617b73250cb6413c5be8f3dc1e73bb21e Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.191504 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvwp5"] Mar 18 15:42:45 crc kubenswrapper[4696]: W0318 15:42:45.208474 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0f11748_034a_4f55_9da4_ee34ca565a33.slice/crio-c79e5a10b6a45f1b6e552f0c1f27ffeb335b35938518faa19f8b60bf45d58e1a WatchSource:0}: Error finding container c79e5a10b6a45f1b6e552f0c1f27ffeb335b35938518faa19f8b60bf45d58e1a: Status 404 returned error can't find the container with id c79e5a10b6a45f1b6e552f0c1f27ffeb335b35938518faa19f8b60bf45d58e1a Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.723333 4696 generic.go:334] "Generic (PLEG): container finished" podID="c88d01e2-45ab-4111-9029-3f3e2c12c58e" containerID="4b4d770fe5b3c4c6c93f92cdada59663aef685692a5dac586cd38133102dcd38" exitCode=0 Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.723451 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbfmt" 
event={"ID":"c88d01e2-45ab-4111-9029-3f3e2c12c58e","Type":"ContainerDied","Data":"4b4d770fe5b3c4c6c93f92cdada59663aef685692a5dac586cd38133102dcd38"} Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.723575 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbfmt" event={"ID":"c88d01e2-45ab-4111-9029-3f3e2c12c58e","Type":"ContainerStarted","Data":"8c8b49bad3efa801a75ca2f7c5c1721617b73250cb6413c5be8f3dc1e73bb21e"} Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.725151 4696 generic.go:334] "Generic (PLEG): container finished" podID="c0f11748-034a-4f55-9da4-ee34ca565a33" containerID="6f1f5ed345db45d91b82576d4db39061865005aae8d7313f1c538b4257d4e702" exitCode=0 Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.725247 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvwp5" event={"ID":"c0f11748-034a-4f55-9da4-ee34ca565a33","Type":"ContainerDied","Data":"6f1f5ed345db45d91b82576d4db39061865005aae8d7313f1c538b4257d4e702"} Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.725271 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvwp5" event={"ID":"c0f11748-034a-4f55-9da4-ee34ca565a33","Type":"ContainerStarted","Data":"c79e5a10b6a45f1b6e552f0c1f27ffeb335b35938518faa19f8b60bf45d58e1a"} Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.727861 4696 generic.go:334] "Generic (PLEG): container finished" podID="a4c082b9-ab53-4ff0-94db-c714a1fc683e" containerID="f0c8180955fcaae7ba0211427dd6202c7e0f329610300c11b169fae92393a228" exitCode=0 Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.727953 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp7" event={"ID":"a4c082b9-ab53-4ff0-94db-c714a1fc683e","Type":"ContainerDied","Data":"f0c8180955fcaae7ba0211427dd6202c7e0f329610300c11b169fae92393a228"} Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 
15:42:45.732461 4696 generic.go:334] "Generic (PLEG): container finished" podID="cee4651c-7e34-42f9-bb81-9537803fa622" containerID="a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17" exitCode=0 Mar 18 15:42:45 crc kubenswrapper[4696]: I0318 15:42:45.732543 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxwqj" event={"ID":"cee4651c-7e34-42f9-bb81-9537803fa622","Type":"ContainerDied","Data":"a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17"} Mar 18 15:42:46 crc kubenswrapper[4696]: I0318 15:42:46.197126 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wgfqb" Mar 18 15:42:46 crc kubenswrapper[4696]: I0318 15:42:46.270148 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g894v"] Mar 18 15:42:46 crc kubenswrapper[4696]: I0318 15:42:46.741482 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxwqj" event={"ID":"cee4651c-7e34-42f9-bb81-9537803fa622","Type":"ContainerStarted","Data":"6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642"} Mar 18 15:42:46 crc kubenswrapper[4696]: I0318 15:42:46.743603 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp7" event={"ID":"a4c082b9-ab53-4ff0-94db-c714a1fc683e","Type":"ContainerStarted","Data":"b3810c0442c00740ea33f5eda8140f61cb06c755cd4ffb00da100498f4393d1f"} Mar 18 15:42:46 crc kubenswrapper[4696]: I0318 15:42:46.763081 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gxwqj" podStartSLOduration=2.166961537 podStartE2EDuration="4.763063422s" podCreationTimestamp="2026-03-18 15:42:42 +0000 UTC" firstStartedPulling="2026-03-18 15:42:43.70803313 +0000 UTC m=+406.714207346" lastFinishedPulling="2026-03-18 15:42:46.304135025 +0000 UTC 
m=+409.310309231" observedRunningTime="2026-03-18 15:42:46.761911483 +0000 UTC m=+409.768085689" watchObservedRunningTime="2026-03-18 15:42:46.763063422 +0000 UTC m=+409.769237628" Mar 18 15:42:46 crc kubenswrapper[4696]: I0318 15:42:46.790232 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g8lp7" podStartSLOduration=2.201143125 podStartE2EDuration="4.790212629s" podCreationTimestamp="2026-03-18 15:42:42 +0000 UTC" firstStartedPulling="2026-03-18 15:42:43.706655415 +0000 UTC m=+406.712829621" lastFinishedPulling="2026-03-18 15:42:46.295724919 +0000 UTC m=+409.301899125" observedRunningTime="2026-03-18 15:42:46.782675776 +0000 UTC m=+409.788849982" watchObservedRunningTime="2026-03-18 15:42:46.790212629 +0000 UTC m=+409.796386835" Mar 18 15:42:47 crc kubenswrapper[4696]: I0318 15:42:47.751269 4696 generic.go:334] "Generic (PLEG): container finished" podID="c0f11748-034a-4f55-9da4-ee34ca565a33" containerID="2659e2ed4c53d96f6012ecca5b36ef1fd1a75ece6d88156d954498619dd91793" exitCode=0 Mar 18 15:42:47 crc kubenswrapper[4696]: I0318 15:42:47.751608 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvwp5" event={"ID":"c0f11748-034a-4f55-9da4-ee34ca565a33","Type":"ContainerDied","Data":"2659e2ed4c53d96f6012ecca5b36ef1fd1a75ece6d88156d954498619dd91793"} Mar 18 15:42:47 crc kubenswrapper[4696]: I0318 15:42:47.756181 4696 generic.go:334] "Generic (PLEG): container finished" podID="c88d01e2-45ab-4111-9029-3f3e2c12c58e" containerID="fc6c47bcca255d1a21ed68dd9c06773fadadc36646fa5e117cd59307bc29dae5" exitCode=0 Mar 18 15:42:47 crc kubenswrapper[4696]: I0318 15:42:47.757531 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbfmt" event={"ID":"c88d01e2-45ab-4111-9029-3f3e2c12c58e","Type":"ContainerDied","Data":"fc6c47bcca255d1a21ed68dd9c06773fadadc36646fa5e117cd59307bc29dae5"} Mar 18 15:42:48 crc 
kubenswrapper[4696]: I0318 15:42:48.765302 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbfmt" event={"ID":"c88d01e2-45ab-4111-9029-3f3e2c12c58e","Type":"ContainerStarted","Data":"f61f5353f7533644a92ccc1738c3e3b9d04ea62c8e2e3648200fe769a91edd5d"} Mar 18 15:42:48 crc kubenswrapper[4696]: I0318 15:42:48.768243 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvwp5" event={"ID":"c0f11748-034a-4f55-9da4-ee34ca565a33","Type":"ContainerStarted","Data":"04f23489a5437629bea5eca4bae390913bad593d4a2314695c334de68dc506e4"} Mar 18 15:42:48 crc kubenswrapper[4696]: I0318 15:42:48.790796 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dbfmt" podStartSLOduration=2.136486632 podStartE2EDuration="4.790771461s" podCreationTimestamp="2026-03-18 15:42:44 +0000 UTC" firstStartedPulling="2026-03-18 15:42:45.731841077 +0000 UTC m=+408.738015293" lastFinishedPulling="2026-03-18 15:42:48.386125916 +0000 UTC m=+411.392300122" observedRunningTime="2026-03-18 15:42:48.78839938 +0000 UTC m=+411.794573586" watchObservedRunningTime="2026-03-18 15:42:48.790771461 +0000 UTC m=+411.796945667" Mar 18 15:42:48 crc kubenswrapper[4696]: I0318 15:42:48.810210 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mvwp5" podStartSLOduration=2.374310166 podStartE2EDuration="4.810187259s" podCreationTimestamp="2026-03-18 15:42:44 +0000 UTC" firstStartedPulling="2026-03-18 15:42:45.727607099 +0000 UTC m=+408.733781305" lastFinishedPulling="2026-03-18 15:42:48.163484192 +0000 UTC m=+411.169658398" observedRunningTime="2026-03-18 15:42:48.808049684 +0000 UTC m=+411.814223890" watchObservedRunningTime="2026-03-18 15:42:48.810187259 +0000 UTC m=+411.816361475" Mar 18 15:42:52 crc kubenswrapper[4696]: I0318 15:42:52.450811 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:52 crc kubenswrapper[4696]: I0318 15:42:52.451439 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:52 crc kubenswrapper[4696]: I0318 15:42:52.495933 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:52 crc kubenswrapper[4696]: I0318 15:42:52.591539 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:52 crc kubenswrapper[4696]: I0318 15:42:52.591856 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:52 crc kubenswrapper[4696]: I0318 15:42:52.632601 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:52 crc kubenswrapper[4696]: I0318 15:42:52.823271 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 15:42:52 crc kubenswrapper[4696]: I0318 15:42:52.824081 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g8lp7" Mar 18 15:42:54 crc kubenswrapper[4696]: I0318 15:42:54.823498 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:54 crc kubenswrapper[4696]: I0318 15:42:54.823616 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:54 crc kubenswrapper[4696]: I0318 15:42:54.868069 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:55 crc 
kubenswrapper[4696]: I0318 15:42:55.023372 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:55 crc kubenswrapper[4696]: I0318 15:42:55.023427 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:55 crc kubenswrapper[4696]: I0318 15:42:55.059834 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:42:55 crc kubenswrapper[4696]: I0318 15:42:55.846494 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dbfmt" Mar 18 15:42:55 crc kubenswrapper[4696]: I0318 15:42:55.848663 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mvwp5" Mar 18 15:43:11 crc kubenswrapper[4696]: I0318 15:43:11.318855 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" podUID="5744ac11-6c36-4634-903e-298dc7b5ce45" containerName="registry" containerID="cri-o://e85e3d189cdcdbdb9cd340e02a41b68cda153c926b52711b9f8e181d7b9ea3b1" gracePeriod=30 Mar 18 15:43:11 crc kubenswrapper[4696]: I0318 15:43:11.505508 4696 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-g894v container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.34:5000/healthz\": dial tcp 10.217.0.34:5000: connect: connection refused" start-of-body= Mar 18 15:43:11 crc kubenswrapper[4696]: I0318 15:43:11.505951 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" podUID="5744ac11-6c36-4634-903e-298dc7b5ce45" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.34:5000/healthz\": dial tcp 
10.217.0.34:5000: connect: connection refused" Mar 18 15:43:11 crc kubenswrapper[4696]: I0318 15:43:11.905176 4696 generic.go:334] "Generic (PLEG): container finished" podID="5744ac11-6c36-4634-903e-298dc7b5ce45" containerID="e85e3d189cdcdbdb9cd340e02a41b68cda153c926b52711b9f8e181d7b9ea3b1" exitCode=0 Mar 18 15:43:11 crc kubenswrapper[4696]: I0318 15:43:11.905222 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" event={"ID":"5744ac11-6c36-4634-903e-298dc7b5ce45","Type":"ContainerDied","Data":"e85e3d189cdcdbdb9cd340e02a41b68cda153c926b52711b9f8e181d7b9ea3b1"} Mar 18 15:43:11 crc kubenswrapper[4696]: I0318 15:43:11.905246 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" event={"ID":"5744ac11-6c36-4634-903e-298dc7b5ce45","Type":"ContainerDied","Data":"c7275eacd8a9e8df4984f3e3106ef4f0a7e433b98a6cfad329bdc9d48c3f20d0"} Mar 18 15:43:11 crc kubenswrapper[4696]: I0318 15:43:11.905256 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7275eacd8a9e8df4984f3e3106ef4f0a7e433b98a6cfad329bdc9d48c3f20d0" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.082399 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.158363 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-trusted-ca\") pod \"5744ac11-6c36-4634-903e-298dc7b5ce45\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.158428 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5744ac11-6c36-4634-903e-298dc7b5ce45-installation-pull-secrets\") pod \"5744ac11-6c36-4634-903e-298dc7b5ce45\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.158512 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmj4r\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-kube-api-access-dmj4r\") pod \"5744ac11-6c36-4634-903e-298dc7b5ce45\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.158562 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5744ac11-6c36-4634-903e-298dc7b5ce45-ca-trust-extracted\") pod \"5744ac11-6c36-4634-903e-298dc7b5ce45\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.158601 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-bound-sa-token\") pod \"5744ac11-6c36-4634-903e-298dc7b5ce45\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.158621 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-tls\") pod \"5744ac11-6c36-4634-903e-298dc7b5ce45\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.158650 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-certificates\") pod \"5744ac11-6c36-4634-903e-298dc7b5ce45\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.158780 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5744ac11-6c36-4634-903e-298dc7b5ce45\" (UID: \"5744ac11-6c36-4634-903e-298dc7b5ce45\") " Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.159297 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5744ac11-6c36-4634-903e-298dc7b5ce45" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.159678 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5744ac11-6c36-4634-903e-298dc7b5ce45" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.164305 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5744ac11-6c36-4634-903e-298dc7b5ce45" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.164605 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5744ac11-6c36-4634-903e-298dc7b5ce45" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.166511 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5744ac11-6c36-4634-903e-298dc7b5ce45-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5744ac11-6c36-4634-903e-298dc7b5ce45" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.166655 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-kube-api-access-dmj4r" (OuterVolumeSpecName: "kube-api-access-dmj4r") pod "5744ac11-6c36-4634-903e-298dc7b5ce45" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45"). InnerVolumeSpecName "kube-api-access-dmj4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.172063 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5744ac11-6c36-4634-903e-298dc7b5ce45" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.175962 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5744ac11-6c36-4634-903e-298dc7b5ce45-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5744ac11-6c36-4634-903e-298dc7b5ce45" (UID: "5744ac11-6c36-4634-903e-298dc7b5ce45"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.184476 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.184549 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.259816 4696 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-certificates\") on node 
\"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.259855 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5744ac11-6c36-4634-903e-298dc7b5ce45-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.259867 4696 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5744ac11-6c36-4634-903e-298dc7b5ce45-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.259878 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmj4r\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-kube-api-access-dmj4r\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.259886 4696 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5744ac11-6c36-4634-903e-298dc7b5ce45-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.259895 4696 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.259902 4696 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5744ac11-6c36-4634-903e-298dc7b5ce45-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.912306 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g894v" Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.949760 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g894v"] Mar 18 15:43:12 crc kubenswrapper[4696]: I0318 15:43:12.955148 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g894v"] Mar 18 15:43:13 crc kubenswrapper[4696]: I0318 15:43:13.608435 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5744ac11-6c36-4634-903e-298dc7b5ce45" path="/var/lib/kubelet/pods/5744ac11-6c36-4634-903e-298dc7b5ce45/volumes" Mar 18 15:43:42 crc kubenswrapper[4696]: I0318 15:43:42.184030 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:43:42 crc kubenswrapper[4696]: I0318 15:43:42.184677 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:43:42 crc kubenswrapper[4696]: I0318 15:43:42.184732 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:43:42 crc kubenswrapper[4696]: I0318 15:43:42.185490 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a757e73374a5d2719000efd9790c0f7bc244089cee9596be88ce0ad60115e2a2"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:43:42 crc kubenswrapper[4696]: I0318 15:43:42.185565 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://a757e73374a5d2719000efd9790c0f7bc244089cee9596be88ce0ad60115e2a2" gracePeriod=600 Mar 18 15:43:43 crc kubenswrapper[4696]: I0318 15:43:43.071124 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="a757e73374a5d2719000efd9790c0f7bc244089cee9596be88ce0ad60115e2a2" exitCode=0 Mar 18 15:43:43 crc kubenswrapper[4696]: I0318 15:43:43.071212 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"a757e73374a5d2719000efd9790c0f7bc244089cee9596be88ce0ad60115e2a2"} Mar 18 15:43:43 crc kubenswrapper[4696]: I0318 15:43:43.071731 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"ea4660abb995b59b3411f3109e892dc576f9d2448c545e6fbdbaeccd931d4b49"} Mar 18 15:43:43 crc kubenswrapper[4696]: I0318 15:43:43.071755 4696 scope.go:117] "RemoveContainer" containerID="cbd5842c80d0ed4fea3c07aaf1377dd29a1d7c3477e4a25654c2db8491bb0e05" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.129641 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564144-dmqz5"] Mar 18 15:44:00 crc kubenswrapper[4696]: E0318 15:44:00.130505 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5744ac11-6c36-4634-903e-298dc7b5ce45" containerName="registry" Mar 18 15:44:00 crc 
kubenswrapper[4696]: I0318 15:44:00.130572 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5744ac11-6c36-4634-903e-298dc7b5ce45" containerName="registry" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.130699 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5744ac11-6c36-4634-903e-298dc7b5ce45" containerName="registry" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.131849 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.134986 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.135909 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.136055 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.138554 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-dmqz5"] Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.193770 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97fnx\" (UniqueName: \"kubernetes.io/projected/b71ed20f-912f-45b5-9b48-8932987d8bb0-kube-api-access-97fnx\") pod \"auto-csr-approver-29564144-dmqz5\" (UID: \"b71ed20f-912f-45b5-9b48-8932987d8bb0\") " pod="openshift-infra/auto-csr-approver-29564144-dmqz5" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.294666 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97fnx\" (UniqueName: \"kubernetes.io/projected/b71ed20f-912f-45b5-9b48-8932987d8bb0-kube-api-access-97fnx\") pod 
\"auto-csr-approver-29564144-dmqz5\" (UID: \"b71ed20f-912f-45b5-9b48-8932987d8bb0\") " pod="openshift-infra/auto-csr-approver-29564144-dmqz5" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.314471 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97fnx\" (UniqueName: \"kubernetes.io/projected/b71ed20f-912f-45b5-9b48-8932987d8bb0-kube-api-access-97fnx\") pod \"auto-csr-approver-29564144-dmqz5\" (UID: \"b71ed20f-912f-45b5-9b48-8932987d8bb0\") " pod="openshift-infra/auto-csr-approver-29564144-dmqz5" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.451490 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.604739 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-dmqz5"] Mar 18 15:44:00 crc kubenswrapper[4696]: I0318 15:44:00.617489 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:44:01 crc kubenswrapper[4696]: I0318 15:44:01.160630 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" event={"ID":"b71ed20f-912f-45b5-9b48-8932987d8bb0","Type":"ContainerStarted","Data":"f3856f298548b89bd3b2449d6fe4a51fa3bcfd40284a2d3d8545a765f94d7d80"} Mar 18 15:44:02 crc kubenswrapper[4696]: I0318 15:44:02.165911 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" event={"ID":"b71ed20f-912f-45b5-9b48-8932987d8bb0","Type":"ContainerStarted","Data":"9f7a9e75a3711fcc8ccd1a5f8e4dd218fc618fb224979b9ecff4a4fe716c241d"} Mar 18 15:44:02 crc kubenswrapper[4696]: I0318 15:44:02.184735 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" podStartSLOduration=0.981967671 podStartE2EDuration="2.184718969s" 
podCreationTimestamp="2026-03-18 15:44:00 +0000 UTC" firstStartedPulling="2026-03-18 15:44:00.617207949 +0000 UTC m=+483.623382155" lastFinishedPulling="2026-03-18 15:44:01.819959247 +0000 UTC m=+484.826133453" observedRunningTime="2026-03-18 15:44:02.184139064 +0000 UTC m=+485.190313270" watchObservedRunningTime="2026-03-18 15:44:02.184718969 +0000 UTC m=+485.190893175" Mar 18 15:44:03 crc kubenswrapper[4696]: I0318 15:44:03.172313 4696 generic.go:334] "Generic (PLEG): container finished" podID="b71ed20f-912f-45b5-9b48-8932987d8bb0" containerID="9f7a9e75a3711fcc8ccd1a5f8e4dd218fc618fb224979b9ecff4a4fe716c241d" exitCode=0 Mar 18 15:44:03 crc kubenswrapper[4696]: I0318 15:44:03.172386 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" event={"ID":"b71ed20f-912f-45b5-9b48-8932987d8bb0","Type":"ContainerDied","Data":"9f7a9e75a3711fcc8ccd1a5f8e4dd218fc618fb224979b9ecff4a4fe716c241d"} Mar 18 15:44:04 crc kubenswrapper[4696]: I0318 15:44:04.376499 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" Mar 18 15:44:04 crc kubenswrapper[4696]: I0318 15:44:04.449346 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97fnx\" (UniqueName: \"kubernetes.io/projected/b71ed20f-912f-45b5-9b48-8932987d8bb0-kube-api-access-97fnx\") pod \"b71ed20f-912f-45b5-9b48-8932987d8bb0\" (UID: \"b71ed20f-912f-45b5-9b48-8932987d8bb0\") " Mar 18 15:44:04 crc kubenswrapper[4696]: I0318 15:44:04.455626 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b71ed20f-912f-45b5-9b48-8932987d8bb0-kube-api-access-97fnx" (OuterVolumeSpecName: "kube-api-access-97fnx") pod "b71ed20f-912f-45b5-9b48-8932987d8bb0" (UID: "b71ed20f-912f-45b5-9b48-8932987d8bb0"). InnerVolumeSpecName "kube-api-access-97fnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:04 crc kubenswrapper[4696]: I0318 15:44:04.550346 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97fnx\" (UniqueName: \"kubernetes.io/projected/b71ed20f-912f-45b5-9b48-8932987d8bb0-kube-api-access-97fnx\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:05 crc kubenswrapper[4696]: I0318 15:44:05.184716 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" event={"ID":"b71ed20f-912f-45b5-9b48-8932987d8bb0","Type":"ContainerDied","Data":"f3856f298548b89bd3b2449d6fe4a51fa3bcfd40284a2d3d8545a765f94d7d80"} Mar 18 15:44:05 crc kubenswrapper[4696]: I0318 15:44:05.184754 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3856f298548b89bd3b2449d6fe4a51fa3bcfd40284a2d3d8545a765f94d7d80" Mar 18 15:44:05 crc kubenswrapper[4696]: I0318 15:44:05.184801 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-dmqz5" Mar 18 15:44:05 crc kubenswrapper[4696]: I0318 15:44:05.226763 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564138-jzn9j"] Mar 18 15:44:05 crc kubenswrapper[4696]: I0318 15:44:05.230776 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564138-jzn9j"] Mar 18 15:44:05 crc kubenswrapper[4696]: I0318 15:44:05.603091 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854ec8b0-a321-4bcb-9327-96742fec3f31" path="/var/lib/kubelet/pods/854ec8b0-a321-4bcb-9327-96742fec3f31/volumes" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.143468 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt"] Mar 18 15:45:00 crc kubenswrapper[4696]: E0318 15:45:00.145012 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b71ed20f-912f-45b5-9b48-8932987d8bb0" containerName="oc" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.145052 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71ed20f-912f-45b5-9b48-8932987d8bb0" containerName="oc" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.145322 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71ed20f-912f-45b5-9b48-8932987d8bb0" containerName="oc" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.146149 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.148362 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.149787 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.154002 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt"] Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.280070 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-config-volume\") pod \"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.280144 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwf8n\" (UniqueName: \"kubernetes.io/projected/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-kube-api-access-cwf8n\") pod 
\"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.280193 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-secret-volume\") pod \"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.381122 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-config-volume\") pod \"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.381204 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwf8n\" (UniqueName: \"kubernetes.io/projected/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-kube-api-access-cwf8n\") pod \"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.381248 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-secret-volume\") pod \"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.382676 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-config-volume\") pod \"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.393964 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-secret-volume\") pod \"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.407665 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwf8n\" (UniqueName: \"kubernetes.io/projected/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-kube-api-access-cwf8n\") pod \"collect-profiles-29564145-6jvbt\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.466572 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:00 crc kubenswrapper[4696]: I0318 15:45:00.693164 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt"] Mar 18 15:45:01 crc kubenswrapper[4696]: I0318 15:45:01.523699 4696 generic.go:334] "Generic (PLEG): container finished" podID="5fd2f649-6dd8-413f-85fe-36cd6e6cea88" containerID="37da1252302ec3ef393f23519f05d6f604ae011e69b569458030fa3e62f7d98b" exitCode=0 Mar 18 15:45:01 crc kubenswrapper[4696]: I0318 15:45:01.523766 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" event={"ID":"5fd2f649-6dd8-413f-85fe-36cd6e6cea88","Type":"ContainerDied","Data":"37da1252302ec3ef393f23519f05d6f604ae011e69b569458030fa3e62f7d98b"} Mar 18 15:45:01 crc kubenswrapper[4696]: I0318 15:45:01.524166 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" event={"ID":"5fd2f649-6dd8-413f-85fe-36cd6e6cea88","Type":"ContainerStarted","Data":"d47b48044a85baa61e32b6c9f9369a0beeb71c7ee67787404d40638eb868b1f2"} Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.761642 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.825987 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-secret-volume\") pod \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.826055 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-config-volume\") pod \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.826132 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwf8n\" (UniqueName: \"kubernetes.io/projected/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-kube-api-access-cwf8n\") pod \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\" (UID: \"5fd2f649-6dd8-413f-85fe-36cd6e6cea88\") " Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.827794 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-config-volume" (OuterVolumeSpecName: "config-volume") pod "5fd2f649-6dd8-413f-85fe-36cd6e6cea88" (UID: "5fd2f649-6dd8-413f-85fe-36cd6e6cea88"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.834595 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5fd2f649-6dd8-413f-85fe-36cd6e6cea88" (UID: "5fd2f649-6dd8-413f-85fe-36cd6e6cea88"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.834773 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-kube-api-access-cwf8n" (OuterVolumeSpecName: "kube-api-access-cwf8n") pod "5fd2f649-6dd8-413f-85fe-36cd6e6cea88" (UID: "5fd2f649-6dd8-413f-85fe-36cd6e6cea88"). InnerVolumeSpecName "kube-api-access-cwf8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.927897 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.927963 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:02 crc kubenswrapper[4696]: I0318 15:45:02.927977 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwf8n\" (UniqueName: \"kubernetes.io/projected/5fd2f649-6dd8-413f-85fe-36cd6e6cea88-kube-api-access-cwf8n\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:03 crc kubenswrapper[4696]: I0318 15:45:03.535637 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" event={"ID":"5fd2f649-6dd8-413f-85fe-36cd6e6cea88","Type":"ContainerDied","Data":"d47b48044a85baa61e32b6c9f9369a0beeb71c7ee67787404d40638eb868b1f2"} Mar 18 15:45:03 crc kubenswrapper[4696]: I0318 15:45:03.535910 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d47b48044a85baa61e32b6c9f9369a0beeb71c7ee67787404d40638eb868b1f2" Mar 18 15:45:03 crc kubenswrapper[4696]: I0318 15:45:03.535705 4696 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt" Mar 18 15:45:06 crc kubenswrapper[4696]: I0318 15:45:06.543990 4696 scope.go:117] "RemoveContainer" containerID="e85e3d189cdcdbdb9cd340e02a41b68cda153c926b52711b9f8e181d7b9ea3b1" Mar 18 15:45:42 crc kubenswrapper[4696]: I0318 15:45:42.184281 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:45:42 crc kubenswrapper[4696]: I0318 15:45:42.185108 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.144804 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564146-6nclf"] Mar 18 15:46:00 crc kubenswrapper[4696]: E0318 15:46:00.146017 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd2f649-6dd8-413f-85fe-36cd6e6cea88" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.146034 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd2f649-6dd8-413f-85fe-36cd6e6cea88" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.146156 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd2f649-6dd8-413f-85fe-36cd6e6cea88" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.146649 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-6nclf" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.149363 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.149569 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.150072 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.162295 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-6nclf"] Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.255609 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn7km\" (UniqueName: \"kubernetes.io/projected/106545a2-6035-42b7-ac34-45bcbb0451ed-kube-api-access-sn7km\") pod \"auto-csr-approver-29564146-6nclf\" (UID: \"106545a2-6035-42b7-ac34-45bcbb0451ed\") " pod="openshift-infra/auto-csr-approver-29564146-6nclf" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.357197 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn7km\" (UniqueName: \"kubernetes.io/projected/106545a2-6035-42b7-ac34-45bcbb0451ed-kube-api-access-sn7km\") pod \"auto-csr-approver-29564146-6nclf\" (UID: \"106545a2-6035-42b7-ac34-45bcbb0451ed\") " pod="openshift-infra/auto-csr-approver-29564146-6nclf" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.378481 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn7km\" (UniqueName: \"kubernetes.io/projected/106545a2-6035-42b7-ac34-45bcbb0451ed-kube-api-access-sn7km\") pod \"auto-csr-approver-29564146-6nclf\" (UID: \"106545a2-6035-42b7-ac34-45bcbb0451ed\") " 
pod="openshift-infra/auto-csr-approver-29564146-6nclf" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.464948 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-6nclf" Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.680333 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-6nclf"] Mar 18 15:46:00 crc kubenswrapper[4696]: I0318 15:46:00.851768 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-6nclf" event={"ID":"106545a2-6035-42b7-ac34-45bcbb0451ed","Type":"ContainerStarted","Data":"3230ec3d7792ac10e5d403edcd86dca0c87682aa302ab8e1a3412e5ff60db599"} Mar 18 15:46:02 crc kubenswrapper[4696]: I0318 15:46:02.864473 4696 generic.go:334] "Generic (PLEG): container finished" podID="106545a2-6035-42b7-ac34-45bcbb0451ed" containerID="968e99afcac02b809d78b9e9dcd9b5d1b1f16f9506bf67c5e6c3014f292ea646" exitCode=0 Mar 18 15:46:02 crc kubenswrapper[4696]: I0318 15:46:02.864567 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-6nclf" event={"ID":"106545a2-6035-42b7-ac34-45bcbb0451ed","Type":"ContainerDied","Data":"968e99afcac02b809d78b9e9dcd9b5d1b1f16f9506bf67c5e6c3014f292ea646"} Mar 18 15:46:04 crc kubenswrapper[4696]: I0318 15:46:04.069304 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-6nclf" Mar 18 15:46:04 crc kubenswrapper[4696]: I0318 15:46:04.211491 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn7km\" (UniqueName: \"kubernetes.io/projected/106545a2-6035-42b7-ac34-45bcbb0451ed-kube-api-access-sn7km\") pod \"106545a2-6035-42b7-ac34-45bcbb0451ed\" (UID: \"106545a2-6035-42b7-ac34-45bcbb0451ed\") " Mar 18 15:46:04 crc kubenswrapper[4696]: I0318 15:46:04.218640 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106545a2-6035-42b7-ac34-45bcbb0451ed-kube-api-access-sn7km" (OuterVolumeSpecName: "kube-api-access-sn7km") pod "106545a2-6035-42b7-ac34-45bcbb0451ed" (UID: "106545a2-6035-42b7-ac34-45bcbb0451ed"). InnerVolumeSpecName "kube-api-access-sn7km". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:46:04 crc kubenswrapper[4696]: I0318 15:46:04.312949 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn7km\" (UniqueName: \"kubernetes.io/projected/106545a2-6035-42b7-ac34-45bcbb0451ed-kube-api-access-sn7km\") on node \"crc\" DevicePath \"\"" Mar 18 15:46:04 crc kubenswrapper[4696]: I0318 15:46:04.878421 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-6nclf" event={"ID":"106545a2-6035-42b7-ac34-45bcbb0451ed","Type":"ContainerDied","Data":"3230ec3d7792ac10e5d403edcd86dca0c87682aa302ab8e1a3412e5ff60db599"} Mar 18 15:46:04 crc kubenswrapper[4696]: I0318 15:46:04.878474 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3230ec3d7792ac10e5d403edcd86dca0c87682aa302ab8e1a3412e5ff60db599" Mar 18 15:46:04 crc kubenswrapper[4696]: I0318 15:46:04.878574 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-6nclf" Mar 18 15:46:05 crc kubenswrapper[4696]: I0318 15:46:05.137413 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vkm8r"] Mar 18 15:46:05 crc kubenswrapper[4696]: I0318 15:46:05.142578 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vkm8r"] Mar 18 15:46:05 crc kubenswrapper[4696]: I0318 15:46:05.606860 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2528428-1e7c-49d7-8f64-d38ec08d18a7" path="/var/lib/kubelet/pods/a2528428-1e7c-49d7-8f64-d38ec08d18a7/volumes" Mar 18 15:46:06 crc kubenswrapper[4696]: I0318 15:46:06.582718 4696 scope.go:117] "RemoveContainer" containerID="5958a3f0397e5e03914c295d0124da20a9ac1aae5fb7b12a96328e1d63529ec2" Mar 18 15:46:12 crc kubenswrapper[4696]: I0318 15:46:12.185309 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:46:12 crc kubenswrapper[4696]: I0318 15:46:12.185966 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:46:42 crc kubenswrapper[4696]: I0318 15:46:42.184977 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:46:42 crc kubenswrapper[4696]: 
I0318 15:46:42.185622 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:46:42 crc kubenswrapper[4696]: I0318 15:46:42.185672 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:46:42 crc kubenswrapper[4696]: I0318 15:46:42.186259 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea4660abb995b59b3411f3109e892dc576f9d2448c545e6fbdbaeccd931d4b49"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:46:42 crc kubenswrapper[4696]: I0318 15:46:42.186316 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://ea4660abb995b59b3411f3109e892dc576f9d2448c545e6fbdbaeccd931d4b49" gracePeriod=600 Mar 18 15:46:43 crc kubenswrapper[4696]: I0318 15:46:43.110453 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="ea4660abb995b59b3411f3109e892dc576f9d2448c545e6fbdbaeccd931d4b49" exitCode=0 Mar 18 15:46:43 crc kubenswrapper[4696]: I0318 15:46:43.110552 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"ea4660abb995b59b3411f3109e892dc576f9d2448c545e6fbdbaeccd931d4b49"} Mar 18 15:46:43 crc 
kubenswrapper[4696]: I0318 15:46:43.110943 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"1b4bf50aa0e21fc64691951c3d86757dea376b50f966710c0ef2ad26f0257226"} Mar 18 15:46:43 crc kubenswrapper[4696]: I0318 15:46:43.110972 4696 scope.go:117] "RemoveContainer" containerID="a757e73374a5d2719000efd9790c0f7bc244089cee9596be88ce0ad60115e2a2" Mar 18 15:47:06 crc kubenswrapper[4696]: I0318 15:47:06.644189 4696 scope.go:117] "RemoveContainer" containerID="bf4ca4aca02cf88e131f2eb9d2bfceeaf570ffdadc28bf974e59c3bd0e189140" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.135239 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564148-cf8xr"] Mar 18 15:48:00 crc kubenswrapper[4696]: E0318 15:48:00.136078 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106545a2-6035-42b7-ac34-45bcbb0451ed" containerName="oc" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.136094 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="106545a2-6035-42b7-ac34-45bcbb0451ed" containerName="oc" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.136227 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="106545a2-6035-42b7-ac34-45bcbb0451ed" containerName="oc" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.136662 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-cf8xr" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.139115 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.139175 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.140321 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.144671 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-cf8xr"] Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.317388 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzmh\" (UniqueName: \"kubernetes.io/projected/f7f8828e-3b38-4c94-80ed-bb354c8be9d1-kube-api-access-5fzmh\") pod \"auto-csr-approver-29564148-cf8xr\" (UID: \"f7f8828e-3b38-4c94-80ed-bb354c8be9d1\") " pod="openshift-infra/auto-csr-approver-29564148-cf8xr" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.418471 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzmh\" (UniqueName: \"kubernetes.io/projected/f7f8828e-3b38-4c94-80ed-bb354c8be9d1-kube-api-access-5fzmh\") pod \"auto-csr-approver-29564148-cf8xr\" (UID: \"f7f8828e-3b38-4c94-80ed-bb354c8be9d1\") " pod="openshift-infra/auto-csr-approver-29564148-cf8xr" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.438479 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzmh\" (UniqueName: \"kubernetes.io/projected/f7f8828e-3b38-4c94-80ed-bb354c8be9d1-kube-api-access-5fzmh\") pod \"auto-csr-approver-29564148-cf8xr\" (UID: \"f7f8828e-3b38-4c94-80ed-bb354c8be9d1\") " 
pod="openshift-infra/auto-csr-approver-29564148-cf8xr" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.510915 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-cf8xr" Mar 18 15:48:00 crc kubenswrapper[4696]: I0318 15:48:00.706034 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-cf8xr"] Mar 18 15:48:01 crc kubenswrapper[4696]: I0318 15:48:01.592295 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-cf8xr" event={"ID":"f7f8828e-3b38-4c94-80ed-bb354c8be9d1","Type":"ContainerStarted","Data":"e5388378c2479d7ddb7899e9d0b503d2cd4b1dfb94f911a0cc522274d9356e67"} Mar 18 15:48:02 crc kubenswrapper[4696]: I0318 15:48:02.600049 4696 generic.go:334] "Generic (PLEG): container finished" podID="f7f8828e-3b38-4c94-80ed-bb354c8be9d1" containerID="3036c3d85e69fe08d278b0c5a9e67004b058f6abbb75fc52b22058bf268ae98c" exitCode=0 Mar 18 15:48:02 crc kubenswrapper[4696]: I0318 15:48:02.600450 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-cf8xr" event={"ID":"f7f8828e-3b38-4c94-80ed-bb354c8be9d1","Type":"ContainerDied","Data":"3036c3d85e69fe08d278b0c5a9e67004b058f6abbb75fc52b22058bf268ae98c"} Mar 18 15:48:03 crc kubenswrapper[4696]: I0318 15:48:03.894045 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-cf8xr" Mar 18 15:48:04 crc kubenswrapper[4696]: I0318 15:48:04.066673 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fzmh\" (UniqueName: \"kubernetes.io/projected/f7f8828e-3b38-4c94-80ed-bb354c8be9d1-kube-api-access-5fzmh\") pod \"f7f8828e-3b38-4c94-80ed-bb354c8be9d1\" (UID: \"f7f8828e-3b38-4c94-80ed-bb354c8be9d1\") " Mar 18 15:48:04 crc kubenswrapper[4696]: I0318 15:48:04.074097 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f8828e-3b38-4c94-80ed-bb354c8be9d1-kube-api-access-5fzmh" (OuterVolumeSpecName: "kube-api-access-5fzmh") pod "f7f8828e-3b38-4c94-80ed-bb354c8be9d1" (UID: "f7f8828e-3b38-4c94-80ed-bb354c8be9d1"). InnerVolumeSpecName "kube-api-access-5fzmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:48:04 crc kubenswrapper[4696]: I0318 15:48:04.167973 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fzmh\" (UniqueName: \"kubernetes.io/projected/f7f8828e-3b38-4c94-80ed-bb354c8be9d1-kube-api-access-5fzmh\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:04 crc kubenswrapper[4696]: I0318 15:48:04.623403 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-cf8xr" event={"ID":"f7f8828e-3b38-4c94-80ed-bb354c8be9d1","Type":"ContainerDied","Data":"e5388378c2479d7ddb7899e9d0b503d2cd4b1dfb94f911a0cc522274d9356e67"} Mar 18 15:48:04 crc kubenswrapper[4696]: I0318 15:48:04.623931 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5388378c2479d7ddb7899e9d0b503d2cd4b1dfb94f911a0cc522274d9356e67" Mar 18 15:48:04 crc kubenswrapper[4696]: I0318 15:48:04.623481 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-cf8xr" Mar 18 15:48:04 crc kubenswrapper[4696]: I0318 15:48:04.956463 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-p5rzg"] Mar 18 15:48:04 crc kubenswrapper[4696]: I0318 15:48:04.961000 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-p5rzg"] Mar 18 15:48:05 crc kubenswrapper[4696]: I0318 15:48:05.604564 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9646518f-1f05-4cb7-a921-5cd0e7af7b4b" path="/var/lib/kubelet/pods/9646518f-1f05-4cb7-a921-5cd0e7af7b4b/volumes" Mar 18 15:48:06 crc kubenswrapper[4696]: I0318 15:48:06.703006 4696 scope.go:117] "RemoveContainer" containerID="5ffd7de8b7b7c0823b55a77f47c2d71a383752769df88c54e14f7a9eb06943c0" Mar 18 15:48:42 crc kubenswrapper[4696]: I0318 15:48:42.185138 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:48:42 crc kubenswrapper[4696]: I0318 15:48:42.186137 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:49:02 crc kubenswrapper[4696]: I0318 15:49:02.362799 4696 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.184895 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.187569 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.314283 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9lz4j"] Mar 18 15:49:12 crc kubenswrapper[4696]: E0318 15:49:12.314926 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f8828e-3b38-4c94-80ed-bb354c8be9d1" containerName="oc" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.314999 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f8828e-3b38-4c94-80ed-bb354c8be9d1" containerName="oc" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.315189 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f8828e-3b38-4c94-80ed-bb354c8be9d1" containerName="oc" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.316182 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.323408 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lz4j"] Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.407981 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-utilities\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.408046 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-catalog-content\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.408092 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rnw5\" (UniqueName: \"kubernetes.io/projected/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-kube-api-access-8rnw5\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.509090 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-utilities\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.509482 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-catalog-content\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.509651 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rnw5\" (UniqueName: \"kubernetes.io/projected/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-kube-api-access-8rnw5\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.509687 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-utilities\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.510000 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-catalog-content\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.548264 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rnw5\" (UniqueName: \"kubernetes.io/projected/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-kube-api-access-8rnw5\") pod \"redhat-marketplace-9lz4j\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.633744 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:12 crc kubenswrapper[4696]: I0318 15:49:12.928807 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lz4j"] Mar 18 15:49:13 crc kubenswrapper[4696]: I0318 15:49:13.037311 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lz4j" event={"ID":"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f","Type":"ContainerStarted","Data":"7a2fb3a291cdd027562e301c68cdf7192c911f50c0af710bc2b5a821a7d8a42b"} Mar 18 15:49:14 crc kubenswrapper[4696]: I0318 15:49:14.048790 4696 generic.go:334] "Generic (PLEG): container finished" podID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerID="f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9" exitCode=0 Mar 18 15:49:14 crc kubenswrapper[4696]: I0318 15:49:14.048925 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lz4j" event={"ID":"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f","Type":"ContainerDied","Data":"f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9"} Mar 18 15:49:14 crc kubenswrapper[4696]: I0318 15:49:14.051246 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:49:15 crc kubenswrapper[4696]: I0318 15:49:15.057185 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lz4j" event={"ID":"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f","Type":"ContainerStarted","Data":"610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02"} Mar 18 15:49:16 crc kubenswrapper[4696]: I0318 15:49:16.066839 4696 generic.go:334] "Generic (PLEG): container finished" podID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerID="610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02" exitCode=0 Mar 18 15:49:16 crc kubenswrapper[4696]: I0318 15:49:16.066917 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-9lz4j" event={"ID":"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f","Type":"ContainerDied","Data":"610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02"} Mar 18 15:49:17 crc kubenswrapper[4696]: I0318 15:49:17.077486 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lz4j" event={"ID":"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f","Type":"ContainerStarted","Data":"fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0"} Mar 18 15:49:17 crc kubenswrapper[4696]: I0318 15:49:17.105139 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9lz4j" podStartSLOduration=2.407607109 podStartE2EDuration="5.105106038s" podCreationTimestamp="2026-03-18 15:49:12 +0000 UTC" firstStartedPulling="2026-03-18 15:49:14.050681527 +0000 UTC m=+797.056855733" lastFinishedPulling="2026-03-18 15:49:16.748180436 +0000 UTC m=+799.754354662" observedRunningTime="2026-03-18 15:49:17.101307983 +0000 UTC m=+800.107482189" watchObservedRunningTime="2026-03-18 15:49:17.105106038 +0000 UTC m=+800.111280244" Mar 18 15:49:22 crc kubenswrapper[4696]: I0318 15:49:22.633952 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:22 crc kubenswrapper[4696]: I0318 15:49:22.634410 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:22 crc kubenswrapper[4696]: I0318 15:49:22.675884 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:23 crc kubenswrapper[4696]: I0318 15:49:23.160246 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:23 crc kubenswrapper[4696]: I0318 15:49:23.223787 4696 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lz4j"] Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.133375 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9lz4j" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerName="registry-server" containerID="cri-o://fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0" gracePeriod=2 Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.523683 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.633440 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-utilities\") pod \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.634061 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rnw5\" (UniqueName: \"kubernetes.io/projected/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-kube-api-access-8rnw5\") pod \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.634295 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-catalog-content\") pod \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\" (UID: \"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f\") " Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.634581 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-utilities" (OuterVolumeSpecName: "utilities") pod 
"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" (UID: "9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.641165 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-kube-api-access-8rnw5" (OuterVolumeSpecName: "kube-api-access-8rnw5") pod "9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" (UID: "9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f"). InnerVolumeSpecName "kube-api-access-8rnw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.666147 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" (UID: "9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.738712 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rnw5\" (UniqueName: \"kubernetes.io/projected/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-kube-api-access-8rnw5\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.738760 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:25 crc kubenswrapper[4696]: I0318 15:49:25.738771 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.143974 4696 generic.go:334] "Generic (PLEG): container finished" podID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerID="fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0" exitCode=0 Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.144044 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lz4j" event={"ID":"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f","Type":"ContainerDied","Data":"fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0"} Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.144087 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9lz4j" event={"ID":"9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f","Type":"ContainerDied","Data":"7a2fb3a291cdd027562e301c68cdf7192c911f50c0af710bc2b5a821a7d8a42b"} Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.144088 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9lz4j" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.144114 4696 scope.go:117] "RemoveContainer" containerID="fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.171834 4696 scope.go:117] "RemoveContainer" containerID="610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.195536 4696 scope.go:117] "RemoveContainer" containerID="f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.199436 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lz4j"] Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.207795 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9lz4j"] Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.220941 4696 scope.go:117] "RemoveContainer" containerID="fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0" Mar 18 15:49:26 crc kubenswrapper[4696]: E0318 15:49:26.225240 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0\": container with ID starting with fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0 not found: ID does not exist" containerID="fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.225318 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0"} err="failed to get container status \"fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0\": rpc error: code = NotFound desc = could not find container 
\"fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0\": container with ID starting with fd0e03404820c81a4ab64df80a05eda4a18f57560a008d7b43df9a819f7ac8b0 not found: ID does not exist" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.225368 4696 scope.go:117] "RemoveContainer" containerID="610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02" Mar 18 15:49:26 crc kubenswrapper[4696]: E0318 15:49:26.225976 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02\": container with ID starting with 610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02 not found: ID does not exist" containerID="610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.226016 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02"} err="failed to get container status \"610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02\": rpc error: code = NotFound desc = could not find container \"610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02\": container with ID starting with 610fde0743aeeb8b2d4e9cbf5ea44cd0e363f40642db8f986a1b778c8f65df02 not found: ID does not exist" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.226036 4696 scope.go:117] "RemoveContainer" containerID="f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9" Mar 18 15:49:26 crc kubenswrapper[4696]: E0318 15:49:26.226321 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9\": container with ID starting with f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9 not found: ID does not exist" 
containerID="f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9" Mar 18 15:49:26 crc kubenswrapper[4696]: I0318 15:49:26.226352 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9"} err="failed to get container status \"f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9\": rpc error: code = NotFound desc = could not find container \"f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9\": container with ID starting with f62d778e04045c3984d72e3242e4af66231f0825a7c7cc2a05a0e946c2b273f9 not found: ID does not exist" Mar 18 15:49:27 crc kubenswrapper[4696]: I0318 15:49:27.607361 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" path="/var/lib/kubelet/pods/9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f/volumes" Mar 18 15:49:42 crc kubenswrapper[4696]: I0318 15:49:42.185138 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:49:42 crc kubenswrapper[4696]: I0318 15:49:42.185991 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:49:42 crc kubenswrapper[4696]: I0318 15:49:42.186051 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:49:42 crc kubenswrapper[4696]: I0318 15:49:42.186760 4696 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b4bf50aa0e21fc64691951c3d86757dea376b50f966710c0ef2ad26f0257226"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:49:42 crc kubenswrapper[4696]: I0318 15:49:42.186831 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://1b4bf50aa0e21fc64691951c3d86757dea376b50f966710c0ef2ad26f0257226" gracePeriod=600 Mar 18 15:49:43 crc kubenswrapper[4696]: I0318 15:49:43.273386 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="1b4bf50aa0e21fc64691951c3d86757dea376b50f966710c0ef2ad26f0257226" exitCode=0 Mar 18 15:49:43 crc kubenswrapper[4696]: I0318 15:49:43.273813 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"1b4bf50aa0e21fc64691951c3d86757dea376b50f966710c0ef2ad26f0257226"} Mar 18 15:49:43 crc kubenswrapper[4696]: I0318 15:49:43.275624 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"8dbceebb5bec41c37e5bdb742818ff7d79c094f0f97795f1ea326504fa9fafa5"} Mar 18 15:49:43 crc kubenswrapper[4696]: I0318 15:49:43.275775 4696 scope.go:117] "RemoveContainer" containerID="ea4660abb995b59b3411f3109e892dc576f9d2448c545e6fbdbaeccd931d4b49" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.253858 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2"] Mar 18 
15:49:49 crc kubenswrapper[4696]: E0318 15:49:49.255169 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerName="extract-content" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.255193 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerName="extract-content" Mar 18 15:49:49 crc kubenswrapper[4696]: E0318 15:49:49.255219 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerName="extract-utilities" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.255228 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerName="extract-utilities" Mar 18 15:49:49 crc kubenswrapper[4696]: E0318 15:49:49.255246 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerName="registry-server" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.255256 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerName="registry-server" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.255395 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bafc3d7-fa7a-48e8-a88b-93bdcf1c851f" containerName="registry-server" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.256026 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.258620 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.258892 4696 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-pqdzk" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.260254 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.260496 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-lhsfv"] Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.261574 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lhsfv" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.263370 4696 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8v7c2" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.275974 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2"] Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.283424 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lhsfv"] Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.295076 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7ncrl"] Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.296207 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.299511 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56sl8\" (UniqueName: \"kubernetes.io/projected/e65694c1-c09d-4ab3-8032-640197b84e20-kube-api-access-56sl8\") pod \"cert-manager-858654f9db-lhsfv\" (UID: \"e65694c1-c09d-4ab3-8032-640197b84e20\") " pod="cert-manager/cert-manager-858654f9db-lhsfv" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.299593 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7pz\" (UniqueName: \"kubernetes.io/projected/e4aa5b80-28c0-4b73-91ef-a0b1325d7823-kube-api-access-mp7pz\") pod \"cert-manager-cainjector-cf98fcc89-wmgb2\" (UID: \"e4aa5b80-28c0-4b73-91ef-a0b1325d7823\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.300172 4696 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vdzm2" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.315496 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7ncrl"] Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.401037 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56sl8\" (UniqueName: \"kubernetes.io/projected/e65694c1-c09d-4ab3-8032-640197b84e20-kube-api-access-56sl8\") pod \"cert-manager-858654f9db-lhsfv\" (UID: \"e65694c1-c09d-4ab3-8032-640197b84e20\") " pod="cert-manager/cert-manager-858654f9db-lhsfv" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.401731 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7pz\" (UniqueName: \"kubernetes.io/projected/e4aa5b80-28c0-4b73-91ef-a0b1325d7823-kube-api-access-mp7pz\") pod 
\"cert-manager-cainjector-cf98fcc89-wmgb2\" (UID: \"e4aa5b80-28c0-4b73-91ef-a0b1325d7823\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.401828 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl56z\" (UniqueName: \"kubernetes.io/projected/ed08594c-854d-4c4d-8390-025916809f21-kube-api-access-wl56z\") pod \"cert-manager-webhook-687f57d79b-7ncrl\" (UID: \"ed08594c-854d-4c4d-8390-025916809f21\") " pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.423584 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7pz\" (UniqueName: \"kubernetes.io/projected/e4aa5b80-28c0-4b73-91ef-a0b1325d7823-kube-api-access-mp7pz\") pod \"cert-manager-cainjector-cf98fcc89-wmgb2\" (UID: \"e4aa5b80-28c0-4b73-91ef-a0b1325d7823\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.423595 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56sl8\" (UniqueName: \"kubernetes.io/projected/e65694c1-c09d-4ab3-8032-640197b84e20-kube-api-access-56sl8\") pod \"cert-manager-858654f9db-lhsfv\" (UID: \"e65694c1-c09d-4ab3-8032-640197b84e20\") " pod="cert-manager/cert-manager-858654f9db-lhsfv" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.503672 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl56z\" (UniqueName: \"kubernetes.io/projected/ed08594c-854d-4c4d-8390-025916809f21-kube-api-access-wl56z\") pod \"cert-manager-webhook-687f57d79b-7ncrl\" (UID: \"ed08594c-854d-4c4d-8390-025916809f21\") " pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.522252 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wl56z\" (UniqueName: \"kubernetes.io/projected/ed08594c-854d-4c4d-8390-025916809f21-kube-api-access-wl56z\") pod \"cert-manager-webhook-687f57d79b-7ncrl\" (UID: \"ed08594c-854d-4c4d-8390-025916809f21\") " pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.584258 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.597547 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lhsfv" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.616074 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" Mar 18 15:49:49 crc kubenswrapper[4696]: I0318 15:49:49.856680 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2"] Mar 18 15:49:50 crc kubenswrapper[4696]: I0318 15:49:50.173388 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lhsfv"] Mar 18 15:49:50 crc kubenswrapper[4696]: W0318 15:49:50.179798 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode65694c1_c09d_4ab3_8032_640197b84e20.slice/crio-4f5d0de62712230e368058e2afed44e1778d8774271167acf2b72d51e06c2b2f WatchSource:0}: Error finding container 4f5d0de62712230e368058e2afed44e1778d8774271167acf2b72d51e06c2b2f: Status 404 returned error can't find the container with id 4f5d0de62712230e368058e2afed44e1778d8774271167acf2b72d51e06c2b2f Mar 18 15:49:50 crc kubenswrapper[4696]: I0318 15:49:50.180238 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7ncrl"] Mar 18 15:49:50 crc kubenswrapper[4696]: W0318 15:49:50.182140 4696 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded08594c_854d_4c4d_8390_025916809f21.slice/crio-508255fdf138f4a29c88915b2936af199b3183af9d6f1fcbf7e0fda1c9b5ba0a WatchSource:0}: Error finding container 508255fdf138f4a29c88915b2936af199b3183af9d6f1fcbf7e0fda1c9b5ba0a: Status 404 returned error can't find the container with id 508255fdf138f4a29c88915b2936af199b3183af9d6f1fcbf7e0fda1c9b5ba0a Mar 18 15:49:50 crc kubenswrapper[4696]: I0318 15:49:50.331861 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2" event={"ID":"e4aa5b80-28c0-4b73-91ef-a0b1325d7823","Type":"ContainerStarted","Data":"655a460179b57590a51c35837123a3182c7231cde20bd596773c7fad281c0fb4"} Mar 18 15:49:50 crc kubenswrapper[4696]: I0318 15:49:50.333463 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" event={"ID":"ed08594c-854d-4c4d-8390-025916809f21","Type":"ContainerStarted","Data":"508255fdf138f4a29c88915b2936af199b3183af9d6f1fcbf7e0fda1c9b5ba0a"} Mar 18 15:49:50 crc kubenswrapper[4696]: I0318 15:49:50.334404 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lhsfv" event={"ID":"e65694c1-c09d-4ab3-8032-640197b84e20","Type":"ContainerStarted","Data":"4f5d0de62712230e368058e2afed44e1778d8774271167acf2b72d51e06c2b2f"} Mar 18 15:49:55 crc kubenswrapper[4696]: I0318 15:49:55.370679 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lhsfv" event={"ID":"e65694c1-c09d-4ab3-8032-640197b84e20","Type":"ContainerStarted","Data":"c65e24f84d82007c0d38293e84c288a8bc39a1f8f6c773a9e776caccbd7e6974"} Mar 18 15:49:55 crc kubenswrapper[4696]: I0318 15:49:55.372118 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2" 
event={"ID":"e4aa5b80-28c0-4b73-91ef-a0b1325d7823","Type":"ContainerStarted","Data":"d1680cb39aafff0306b53ad3f621b37e4a9c5a3b40ff96345fbb1a07d03f4dbb"} Mar 18 15:49:55 crc kubenswrapper[4696]: I0318 15:49:55.373369 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" event={"ID":"ed08594c-854d-4c4d-8390-025916809f21","Type":"ContainerStarted","Data":"e0226aa6c073fc7196d0d96243a12fab6acb2fbfbd15183ff0e42f8d00e44134"} Mar 18 15:49:55 crc kubenswrapper[4696]: I0318 15:49:55.373501 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" Mar 18 15:49:55 crc kubenswrapper[4696]: I0318 15:49:55.389866 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-lhsfv" podStartSLOduration=1.5734629660000001 podStartE2EDuration="6.38984594s" podCreationTimestamp="2026-03-18 15:49:49 +0000 UTC" firstStartedPulling="2026-03-18 15:49:50.183407558 +0000 UTC m=+833.189581764" lastFinishedPulling="2026-03-18 15:49:54.999790522 +0000 UTC m=+838.005964738" observedRunningTime="2026-03-18 15:49:55.3874846 +0000 UTC m=+838.393658816" watchObservedRunningTime="2026-03-18 15:49:55.38984594 +0000 UTC m=+838.396020146" Mar 18 15:49:55 crc kubenswrapper[4696]: I0318 15:49:55.438028 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" podStartSLOduration=1.628649757 podStartE2EDuration="6.437985044s" podCreationTimestamp="2026-03-18 15:49:49 +0000 UTC" firstStartedPulling="2026-03-18 15:49:50.191982304 +0000 UTC m=+833.198156510" lastFinishedPulling="2026-03-18 15:49:55.001317591 +0000 UTC m=+838.007491797" observedRunningTime="2026-03-18 15:49:55.414259286 +0000 UTC m=+838.420433492" watchObservedRunningTime="2026-03-18 15:49:55.437985044 +0000 UTC m=+838.444159250" Mar 18 15:49:55 crc kubenswrapper[4696]: I0318 15:49:55.439642 4696 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wmgb2" podStartSLOduration=1.241872932 podStartE2EDuration="6.439632906s" podCreationTimestamp="2026-03-18 15:49:49 +0000 UTC" firstStartedPulling="2026-03-18 15:49:49.869058909 +0000 UTC m=+832.875233105" lastFinishedPulling="2026-03-18 15:49:55.066818873 +0000 UTC m=+838.072993079" observedRunningTime="2026-03-18 15:49:55.436943718 +0000 UTC m=+838.443117924" watchObservedRunningTime="2026-03-18 15:49:55.439632906 +0000 UTC m=+838.445807112" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.226075 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lqxgs"] Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.228886 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovn-controller" containerID="cri-o://e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629" gracePeriod=30 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.229018 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="northd" containerID="cri-o://465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272" gracePeriod=30 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.228998 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="sbdb" containerID="cri-o://e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2" gracePeriod=30 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.229108 4696 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kube-rbac-proxy-node" containerID="cri-o://275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc" gracePeriod=30 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.229108 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38" gracePeriod=30 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.229108 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovn-acl-logging" containerID="cri-o://3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e" gracePeriod=30 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.228963 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="nbdb" containerID="cri-o://9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa" gracePeriod=30 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.271832 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" containerID="cri-o://830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf" gracePeriod=30 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.408063 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovnkube-controller/2.log" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 
15:49:59.410566 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovn-acl-logging/0.log" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.411163 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovn-controller/0.log" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.411715 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf" exitCode=0 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.411818 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38" exitCode=0 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.411828 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc" exitCode=0 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.411838 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e" exitCode=143 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.411846 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629" exitCode=143 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.411773 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" 
event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf"} Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.412058 4696 scope.go:117] "RemoveContainer" containerID="b0ed1a5792af0b624c478617b8f24a47b9144567df3a6e799394969b248847c5" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.412028 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38"} Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.412169 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc"} Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.412188 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e"} Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.412206 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629"} Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.413997 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w9dbn_49424478-cad5-4788-b01e-4ebde47480e1/kube-multus/1.log" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.415025 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-w9dbn_49424478-cad5-4788-b01e-4ebde47480e1/kube-multus/0.log" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.415086 4696 generic.go:334] "Generic (PLEG): container finished" podID="49424478-cad5-4788-b01e-4ebde47480e1" containerID="fa794b83238daf1378a5166ca37e2d751e1a3333d97dd591aa631c9bf3e539b5" exitCode=2 Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.415131 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w9dbn" event={"ID":"49424478-cad5-4788-b01e-4ebde47480e1","Type":"ContainerDied","Data":"fa794b83238daf1378a5166ca37e2d751e1a3333d97dd591aa631c9bf3e539b5"} Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.415947 4696 scope.go:117] "RemoveContainer" containerID="fa794b83238daf1378a5166ca37e2d751e1a3333d97dd591aa631c9bf3e539b5" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.485629 4696 scope.go:117] "RemoveContainer" containerID="684c928a47d3ff8cbb52c9dbfc6282bbca13cf0adb73e03c408098ffc39b8049" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.606345 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovn-acl-logging/0.log" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.607389 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovn-controller/0.log" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.607981 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.670117 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-574tr"] Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.670990 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671043 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671056 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kubecfg-setup" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671111 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kubecfg-setup" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671166 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="nbdb" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671179 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="nbdb" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671219 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kube-rbac-proxy-node" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671262 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kube-rbac-proxy-node" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671271 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" 
containerName="ovn-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671277 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovn-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671329 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671341 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671356 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671364 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671374 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="sbdb" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671380 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="sbdb" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671409 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="northd" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671416 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="northd" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671431 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" 
containerName="ovn-acl-logging" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671438 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovn-acl-logging" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671642 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kube-rbac-proxy-node" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671654 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovn-acl-logging" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671662 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="sbdb" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671672 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671700 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671708 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovn-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671717 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="northd" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671725 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671734 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="nbdb" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671744 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.671963 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.671977 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.672193 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: E0318 15:49:59.672381 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.672401 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerName="ovnkube-controller" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.674877 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706056 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-node-log\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706131 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-env-overrides\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706149 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-bin\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706167 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-log-socket\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706191 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-openvswitch\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706244 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vvzl\" 
(UniqueName: \"kubernetes.io/projected/1dc15d44-2b63-40b8-b9c8-dad533d01710-kube-api-access-9vvzl\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706198 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-node-log" (OuterVolumeSpecName: "node-log") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706271 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-ovn-kubernetes\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706286 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-slash\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706312 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706325 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-var-lib-openvswitch\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706342 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-script-lib\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706347 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-log-socket" (OuterVolumeSpecName: "log-socket") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706360 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-ovn\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706352 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706375 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-slash" (OuterVolumeSpecName: "host-slash") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706383 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-kubelet\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706445 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706493 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706512 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-netd\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706557 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706577 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-netns\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706621 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovn-node-metrics-cert\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706656 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-config\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706683 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-etc-openvswitch\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706705 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-systemd-units\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706771 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-systemd\") pod \"1dc15d44-2b63-40b8-b9c8-dad533d01710\" (UID: \"1dc15d44-2b63-40b8-b9c8-dad533d01710\") " Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706875 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706931 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706959 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.706982 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707002 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707021 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707039 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707066 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707198 4696 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707211 4696 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707221 4696 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707230 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707239 4696 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707247 4696 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707256 4696 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707267 4696 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707276 4696 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707285 4696 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707293 4696 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-systemd-units\") on 
node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707301 4696 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707309 4696 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707318 4696 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707326 4696 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.707829 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.708120 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.713876 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc15d44-2b63-40b8-b9c8-dad533d01710-kube-api-access-9vvzl" (OuterVolumeSpecName: "kube-api-access-9vvzl") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "kube-api-access-9vvzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.714554 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.721469 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1dc15d44-2b63-40b8-b9c8-dad533d01710" (UID: "1dc15d44-2b63-40b8-b9c8-dad533d01710"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808581 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-var-lib-openvswitch\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808649 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-env-overrides\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808738 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-systemd\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808874 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-ovn\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808893 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-openvswitch\") pod \"ovnkube-node-574tr\" (UID: 
\"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808913 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-systemd-units\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808931 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-log-socket\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808946 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovn-node-metrics-cert\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.808965 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovnkube-script-lib\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809223 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-run-netns\") pod 
\"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809300 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-kubelet\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809370 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809429 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-run-ovn-kubernetes\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809461 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-slash\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809488 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-etc-openvswitch\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809550 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-cni-bin\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809587 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfpv7\" (UniqueName: \"kubernetes.io/projected/98898d1e-88a8-4628-9bc0-cb4e7e609cea-kube-api-access-vfpv7\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809640 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-cni-netd\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809679 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovnkube-config\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809723 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-node-log\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809846 4696 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc15d44-2b63-40b8-b9c8-dad533d01710-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809861 4696 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.809959 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vvzl\" (UniqueName: \"kubernetes.io/projected/1dc15d44-2b63-40b8-b9c8-dad533d01710-kube-api-access-9vvzl\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.810033 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.810046 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc15d44-2b63-40b8-b9c8-dad533d01710-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911192 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-systemd-units\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc 
kubenswrapper[4696]: I0318 15:49:59.911259 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-log-socket\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911282 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovn-node-metrics-cert\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911330 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovnkube-script-lib\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911372 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-kubelet\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911393 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-run-netns\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911421 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911449 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-kubelet\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911483 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-run-ovn-kubernetes\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911338 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-systemd-units\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911454 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-run-ovn-kubernetes\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911483 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911338 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-log-socket\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911547 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-slash\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911454 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-run-netns\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911573 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-etc-openvswitch\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911588 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-slash\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911596 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-cni-bin\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911617 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfpv7\" (UniqueName: \"kubernetes.io/projected/98898d1e-88a8-4628-9bc0-cb4e7e609cea-kube-api-access-vfpv7\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911621 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-etc-openvswitch\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911642 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-cni-netd\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911654 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-cni-bin\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911665 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovnkube-config\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911691 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-node-log\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911721 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-var-lib-openvswitch\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911725 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-host-cni-netd\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911744 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-systemd\") pod 
\"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911770 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-env-overrides\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911801 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-ovn\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911817 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-openvswitch\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911875 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-openvswitch\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911908 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-var-lib-openvswitch\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911935 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-systemd\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.911774 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-node-log\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.912156 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98898d1e-88a8-4628-9bc0-cb4e7e609cea-run-ovn\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.912312 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovnkube-script-lib\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.912428 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-env-overrides\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.912904 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovnkube-config\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.915327 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98898d1e-88a8-4628-9bc0-cb4e7e609cea-ovn-node-metrics-cert\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.929060 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfpv7\" (UniqueName: \"kubernetes.io/projected/98898d1e-88a8-4628-9bc0-cb4e7e609cea-kube-api-access-vfpv7\") pod \"ovnkube-node-574tr\" (UID: \"98898d1e-88a8-4628-9bc0-cb4e7e609cea\") " pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:49:59 crc kubenswrapper[4696]: I0318 15:49:59.990652 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.132628 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564150-lks4l"] Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.134627 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.136784 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.138301 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.138460 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.216015 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ncq\" (UniqueName: \"kubernetes.io/projected/4b6ad1a0-39d2-4efa-b976-6879406394d3-kube-api-access-q5ncq\") pod \"auto-csr-approver-29564150-lks4l\" (UID: \"4b6ad1a0-39d2-4efa-b976-6879406394d3\") " pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.317117 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5ncq\" (UniqueName: \"kubernetes.io/projected/4b6ad1a0-39d2-4efa-b976-6879406394d3-kube-api-access-q5ncq\") pod \"auto-csr-approver-29564150-lks4l\" (UID: \"4b6ad1a0-39d2-4efa-b976-6879406394d3\") " pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.334425 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5ncq\" (UniqueName: \"kubernetes.io/projected/4b6ad1a0-39d2-4efa-b976-6879406394d3-kube-api-access-q5ncq\") pod \"auto-csr-approver-29564150-lks4l\" (UID: \"4b6ad1a0-39d2-4efa-b976-6879406394d3\") " pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.426211 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovn-acl-logging/0.log" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.428547 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lqxgs_1dc15d44-2b63-40b8-b9c8-dad533d01710/ovn-controller/0.log" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429032 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2" exitCode=0 Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429062 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa" exitCode=0 Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429075 4696 generic.go:334] "Generic (PLEG): container finished" podID="1dc15d44-2b63-40b8-b9c8-dad533d01710" containerID="465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272" exitCode=0 Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429093 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2"} Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429146 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa"} Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429160 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" 
event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272"} Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429171 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" event={"ID":"1dc15d44-2b63-40b8-b9c8-dad533d01710","Type":"ContainerDied","Data":"b7d855baf66edc8258ad5042e989a341246a8d46a831759090946f8f742fdfd9"} Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429194 4696 scope.go:117] "RemoveContainer" containerID="830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.429246 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lqxgs" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.431071 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w9dbn_49424478-cad5-4788-b01e-4ebde47480e1/kube-multus/1.log" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.431140 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w9dbn" event={"ID":"49424478-cad5-4788-b01e-4ebde47480e1","Type":"ContainerStarted","Data":"b2c3d16afe571258a6823e0a9c68e5f9052d320347fc7f902cc20b31c6d4478b"} Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.434251 4696 generic.go:334] "Generic (PLEG): container finished" podID="98898d1e-88a8-4628-9bc0-cb4e7e609cea" containerID="7f73f6275740b5435bb1aa00133a973acf0d38e81a96e88fbb4f65c5b77cac65" exitCode=0 Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.434318 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerDied","Data":"7f73f6275740b5435bb1aa00133a973acf0d38e81a96e88fbb4f65c5b77cac65"} Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.434418 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"86f1faa84739681c51f26f3e3ec5772a987190981499a4d9b36a1f3b07a45087"} Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.450130 4696 scope.go:117] "RemoveContainer" containerID="e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.454055 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.481817 4696 scope.go:117] "RemoveContainer" containerID="9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.514776 4696 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564150-lks4l_openshift-infra_4b6ad1a0-39d2-4efa-b976-6879406394d3_0(307a13e25aaecad225924429c2cad23d0e9542b8cad9421267d9d1b98e8aacfd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.514918 4696 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564150-lks4l_openshift-infra_4b6ad1a0-39d2-4efa-b976-6879406394d3_0(307a13e25aaecad225924429c2cad23d0e9542b8cad9421267d9d1b98e8aacfd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.514951 4696 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564150-lks4l_openshift-infra_4b6ad1a0-39d2-4efa-b976-6879406394d3_0(307a13e25aaecad225924429c2cad23d0e9542b8cad9421267d9d1b98e8aacfd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.515031 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29564150-lks4l_openshift-infra(4b6ad1a0-39d2-4efa-b976-6879406394d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29564150-lks4l_openshift-infra(4b6ad1a0-39d2-4efa-b976-6879406394d3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564150-lks4l_openshift-infra_4b6ad1a0-39d2-4efa-b976-6879406394d3_0(307a13e25aaecad225924429c2cad23d0e9542b8cad9421267d9d1b98e8aacfd): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29564150-lks4l" podUID="4b6ad1a0-39d2-4efa-b976-6879406394d3" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.515448 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lqxgs"] Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.521217 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lqxgs"] Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.523427 4696 scope.go:117] "RemoveContainer" containerID="465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.539952 4696 scope.go:117] "RemoveContainer" containerID="e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.563892 4696 scope.go:117] "RemoveContainer" containerID="275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.583026 4696 scope.go:117] "RemoveContainer" containerID="3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.607099 4696 scope.go:117] "RemoveContainer" containerID="e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.635367 4696 scope.go:117] "RemoveContainer" containerID="731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.656643 4696 scope.go:117] "RemoveContainer" containerID="830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.657304 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf\": container 
with ID starting with 830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf not found: ID does not exist" containerID="830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.657354 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf"} err="failed to get container status \"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf\": rpc error: code = NotFound desc = could not find container \"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf\": container with ID starting with 830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.657388 4696 scope.go:117] "RemoveContainer" containerID="e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.657670 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\": container with ID starting with e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2 not found: ID does not exist" containerID="e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.657694 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2"} err="failed to get container status \"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\": rpc error: code = NotFound desc = could not find container \"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\": container with ID starting with e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2 not 
found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.657710 4696 scope.go:117] "RemoveContainer" containerID="9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.658633 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\": container with ID starting with 9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa not found: ID does not exist" containerID="9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.658689 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa"} err="failed to get container status \"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\": rpc error: code = NotFound desc = could not find container \"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\": container with ID starting with 9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.658718 4696 scope.go:117] "RemoveContainer" containerID="465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.659198 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\": container with ID starting with 465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272 not found: ID does not exist" containerID="465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.659230 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272"} err="failed to get container status \"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\": rpc error: code = NotFound desc = could not find container \"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\": container with ID starting with 465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.659245 4696 scope.go:117] "RemoveContainer" containerID="e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.659470 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\": container with ID starting with e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38 not found: ID does not exist" containerID="e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.659491 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38"} err="failed to get container status \"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\": rpc error: code = NotFound desc = could not find container \"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\": container with ID starting with e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.659504 4696 scope.go:117] "RemoveContainer" containerID="275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 
15:50:00.659772 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\": container with ID starting with 275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc not found: ID does not exist" containerID="275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.659793 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc"} err="failed to get container status \"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\": rpc error: code = NotFound desc = could not find container \"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\": container with ID starting with 275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.659805 4696 scope.go:117] "RemoveContainer" containerID="3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.660092 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\": container with ID starting with 3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e not found: ID does not exist" containerID="3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.660111 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e"} err="failed to get container status \"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\": rpc 
error: code = NotFound desc = could not find container \"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\": container with ID starting with 3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.660129 4696 scope.go:117] "RemoveContainer" containerID="e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.660710 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\": container with ID starting with e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629 not found: ID does not exist" containerID="e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.660730 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629"} err="failed to get container status \"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\": rpc error: code = NotFound desc = could not find container \"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\": container with ID starting with e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.660743 4696 scope.go:117] "RemoveContainer" containerID="731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854" Mar 18 15:50:00 crc kubenswrapper[4696]: E0318 15:50:00.661108 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\": container with ID starting with 
731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854 not found: ID does not exist" containerID="731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.661150 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854"} err="failed to get container status \"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\": rpc error: code = NotFound desc = could not find container \"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\": container with ID starting with 731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.661178 4696 scope.go:117] "RemoveContainer" containerID="830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.661539 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf"} err="failed to get container status \"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf\": rpc error: code = NotFound desc = could not find container \"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf\": container with ID starting with 830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.661562 4696 scope.go:117] "RemoveContainer" containerID="e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.661844 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2"} err="failed to get container status 
\"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\": rpc error: code = NotFound desc = could not find container \"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\": container with ID starting with e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.661867 4696 scope.go:117] "RemoveContainer" containerID="9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.662146 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa"} err="failed to get container status \"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\": rpc error: code = NotFound desc = could not find container \"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\": container with ID starting with 9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.662162 4696 scope.go:117] "RemoveContainer" containerID="465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.662454 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272"} err="failed to get container status \"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\": rpc error: code = NotFound desc = could not find container \"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\": container with ID starting with 465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.662471 4696 scope.go:117] "RemoveContainer" 
containerID="e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.662803 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38"} err="failed to get container status \"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\": rpc error: code = NotFound desc = could not find container \"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\": container with ID starting with e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.662822 4696 scope.go:117] "RemoveContainer" containerID="275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.663129 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc"} err="failed to get container status \"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\": rpc error: code = NotFound desc = could not find container \"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\": container with ID starting with 275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.663165 4696 scope.go:117] "RemoveContainer" containerID="3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.663436 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e"} err="failed to get container status \"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\": rpc error: code = NotFound desc = could 
not find container \"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\": container with ID starting with 3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.663459 4696 scope.go:117] "RemoveContainer" containerID="e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.663698 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629"} err="failed to get container status \"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\": rpc error: code = NotFound desc = could not find container \"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\": container with ID starting with e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.663721 4696 scope.go:117] "RemoveContainer" containerID="731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.663954 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854"} err="failed to get container status \"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\": rpc error: code = NotFound desc = could not find container \"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\": container with ID starting with 731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.663979 4696 scope.go:117] "RemoveContainer" containerID="830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 
15:50:00.664188 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf"} err="failed to get container status \"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf\": rpc error: code = NotFound desc = could not find container \"830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf\": container with ID starting with 830374b73fa72ec6e84f666fa55509e4a61825915bc5f9ffec92a5bf4743e9cf not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.664221 4696 scope.go:117] "RemoveContainer" containerID="e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.664550 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2"} err="failed to get container status \"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\": rpc error: code = NotFound desc = could not find container \"e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2\": container with ID starting with e83c50e41e742f6b714fb0aa8c10c01d601eaa43df81d0ff2ddb080d380497a2 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.664576 4696 scope.go:117] "RemoveContainer" containerID="9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.664852 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa"} err="failed to get container status \"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\": rpc error: code = NotFound desc = could not find container \"9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa\": container with ID starting with 
9e13ced1cff8b94ae0308202bf29b8717a5a8f5eab919d99d915f5a1d0de41aa not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.664871 4696 scope.go:117] "RemoveContainer" containerID="465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.665415 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272"} err="failed to get container status \"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\": rpc error: code = NotFound desc = could not find container \"465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272\": container with ID starting with 465aadaa59f1ad4c5f0affc344eec59f743c4ff45cf829d597a2f04cc4988272 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.665448 4696 scope.go:117] "RemoveContainer" containerID="e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.665914 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38"} err="failed to get container status \"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\": rpc error: code = NotFound desc = could not find container \"e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38\": container with ID starting with e6a37cefedce88051cb3bcdaf81b50c5b3fd91ec83b165458bcb237f3124fb38 not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.665939 4696 scope.go:117] "RemoveContainer" containerID="275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.666253 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc"} err="failed to get container status \"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\": rpc error: code = NotFound desc = could not find container \"275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc\": container with ID starting with 275f623e2759a34fd17d85ba37363bdacad8351f1587b20a3473a6d7d0f605bc not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.666277 4696 scope.go:117] "RemoveContainer" containerID="3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.666569 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e"} err="failed to get container status \"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\": rpc error: code = NotFound desc = could not find container \"3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e\": container with ID starting with 3044ea7c449a45658a10a17e03f1738b76ee761b7380d14a4638a192610bb96e not found: ID does not exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.666601 4696 scope.go:117] "RemoveContainer" containerID="e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.666978 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629"} err="failed to get container status \"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\": rpc error: code = NotFound desc = could not find container \"e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629\": container with ID starting with e106c3f6e83a7f715bc8a6ac4ec68cf755b5530d20339735825ce7116f3eb629 not found: ID does not 
exist" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.666999 4696 scope.go:117] "RemoveContainer" containerID="731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854" Mar 18 15:50:00 crc kubenswrapper[4696]: I0318 15:50:00.667782 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854"} err="failed to get container status \"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\": rpc error: code = NotFound desc = could not find container \"731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854\": container with ID starting with 731acfaf4873c262cb62697657c59cff05cb9ca33064faca42a21941b6633854 not found: ID does not exist" Mar 18 15:50:01 crc kubenswrapper[4696]: I0318 15:50:01.441885 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"150c256a0aa0606dfa1ce9c4d3bc5f2a841b8da26acb2ec4462ce515e79779b3"} Mar 18 15:50:01 crc kubenswrapper[4696]: I0318 15:50:01.442166 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"94e34c61c3a8947b0587433ad017377229a4e733f5de417777873f4fb1aedf39"} Mar 18 15:50:01 crc kubenswrapper[4696]: I0318 15:50:01.442180 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"1c52b0f7e4abe34a850a7ffc7393ac455fa71296e7c8892d5cdefa2f4e66633f"} Mar 18 15:50:01 crc kubenswrapper[4696]: I0318 15:50:01.442190 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" 
event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"01ac4f257a4b6c39696d08626754f0424a595d7c269f6f66673bb2d9338e561a"} Mar 18 15:50:01 crc kubenswrapper[4696]: I0318 15:50:01.442203 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"3238df16ad959d94486a7751120da3b640fb4e0a7ff39d4cfd0448828cf4d946"} Mar 18 15:50:01 crc kubenswrapper[4696]: I0318 15:50:01.442213 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"ff27d2f00e37b42e5cb6fbc4d454849b739f8e5e189f0397f031df5a48f19877"} Mar 18 15:50:01 crc kubenswrapper[4696]: I0318 15:50:01.605010 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc15d44-2b63-40b8-b9c8-dad533d01710" path="/var/lib/kubelet/pods/1dc15d44-2b63-40b8-b9c8-dad533d01710/volumes" Mar 18 15:50:03 crc kubenswrapper[4696]: I0318 15:50:03.467172 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"b8baad71b63a011b2accf28d73bb0dc41651b9cfa865434ede6cef72850fbe4b"} Mar 18 15:50:04 crc kubenswrapper[4696]: I0318 15:50:04.620378 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-7ncrl" Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.429009 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-lks4l"] Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.429689 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.430130 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:06 crc kubenswrapper[4696]: E0318 15:50:06.457363 4696 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564150-lks4l_openshift-infra_4b6ad1a0-39d2-4efa-b976-6879406394d3_0(cc5f55868ca0fcb1a5df608a7552f3239226c5f0556f53f3eabd0dde44079d7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:50:06 crc kubenswrapper[4696]: E0318 15:50:06.457436 4696 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564150-lks4l_openshift-infra_4b6ad1a0-39d2-4efa-b976-6879406394d3_0(cc5f55868ca0fcb1a5df608a7552f3239226c5f0556f53f3eabd0dde44079d7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:06 crc kubenswrapper[4696]: E0318 15:50:06.457457 4696 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564150-lks4l_openshift-infra_4b6ad1a0-39d2-4efa-b976-6879406394d3_0(cc5f55868ca0fcb1a5df608a7552f3239226c5f0556f53f3eabd0dde44079d7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:06 crc kubenswrapper[4696]: E0318 15:50:06.457497 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29564150-lks4l_openshift-infra(4b6ad1a0-39d2-4efa-b976-6879406394d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29564150-lks4l_openshift-infra(4b6ad1a0-39d2-4efa-b976-6879406394d3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564150-lks4l_openshift-infra_4b6ad1a0-39d2-4efa-b976-6879406394d3_0(cc5f55868ca0fcb1a5df608a7552f3239226c5f0556f53f3eabd0dde44079d7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29564150-lks4l" podUID="4b6ad1a0-39d2-4efa-b976-6879406394d3" Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.485413 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" event={"ID":"98898d1e-88a8-4628-9bc0-cb4e7e609cea","Type":"ContainerStarted","Data":"8867c07e01d4289bc4f1be6fe513310a055fc4b7be61b669f7b90f85a35dcee9"} Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.485749 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.485909 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.485978 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.520570 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 
15:50:06.523592 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:50:06 crc kubenswrapper[4696]: I0318 15:50:06.550265 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" podStartSLOduration=7.55024237 podStartE2EDuration="7.55024237s" podCreationTimestamp="2026-03-18 15:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:50:06.520340466 +0000 UTC m=+849.526514682" watchObservedRunningTime="2026-03-18 15:50:06.55024237 +0000 UTC m=+849.556416576" Mar 18 15:50:19 crc kubenswrapper[4696]: I0318 15:50:19.597208 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:19 crc kubenswrapper[4696]: I0318 15:50:19.598322 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:19 crc kubenswrapper[4696]: I0318 15:50:19.855782 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-lks4l"] Mar 18 15:50:20 crc kubenswrapper[4696]: I0318 15:50:20.560455 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-lks4l" event={"ID":"4b6ad1a0-39d2-4efa-b976-6879406394d3","Type":"ContainerStarted","Data":"9185129af21fd9af96b97e45ebe77ccecb8550f753fb4cd9b10f6fc6290f321a"} Mar 18 15:50:21 crc kubenswrapper[4696]: I0318 15:50:21.569410 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-lks4l" event={"ID":"4b6ad1a0-39d2-4efa-b976-6879406394d3","Type":"ContainerStarted","Data":"67687a9e1a40fad3c084ffa78922a2daf272549d0af7985cbea14e5473be0f9e"} Mar 18 15:50:21 crc kubenswrapper[4696]: I0318 15:50:21.589124 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564150-lks4l" podStartSLOduration=20.271600925 podStartE2EDuration="21.589102456s" podCreationTimestamp="2026-03-18 15:50:00 +0000 UTC" firstStartedPulling="2026-03-18 15:50:19.867187684 +0000 UTC m=+862.873361890" lastFinishedPulling="2026-03-18 15:50:21.184689175 +0000 UTC m=+864.190863421" observedRunningTime="2026-03-18 15:50:21.586063789 +0000 UTC m=+864.592238035" watchObservedRunningTime="2026-03-18 15:50:21.589102456 +0000 UTC m=+864.595276662" Mar 18 15:50:22 crc kubenswrapper[4696]: I0318 15:50:22.577451 4696 generic.go:334] "Generic (PLEG): container finished" podID="4b6ad1a0-39d2-4efa-b976-6879406394d3" containerID="67687a9e1a40fad3c084ffa78922a2daf272549d0af7985cbea14e5473be0f9e" exitCode=0 Mar 18 15:50:22 crc kubenswrapper[4696]: I0318 15:50:22.577504 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-lks4l" 
event={"ID":"4b6ad1a0-39d2-4efa-b976-6879406394d3","Type":"ContainerDied","Data":"67687a9e1a40fad3c084ffa78922a2daf272549d0af7985cbea14e5473be0f9e"} Mar 18 15:50:23 crc kubenswrapper[4696]: I0318 15:50:23.856870 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:23 crc kubenswrapper[4696]: I0318 15:50:23.952692 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5ncq\" (UniqueName: \"kubernetes.io/projected/4b6ad1a0-39d2-4efa-b976-6879406394d3-kube-api-access-q5ncq\") pod \"4b6ad1a0-39d2-4efa-b976-6879406394d3\" (UID: \"4b6ad1a0-39d2-4efa-b976-6879406394d3\") " Mar 18 15:50:23 crc kubenswrapper[4696]: I0318 15:50:23.959119 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b6ad1a0-39d2-4efa-b976-6879406394d3-kube-api-access-q5ncq" (OuterVolumeSpecName: "kube-api-access-q5ncq") pod "4b6ad1a0-39d2-4efa-b976-6879406394d3" (UID: "4b6ad1a0-39d2-4efa-b976-6879406394d3"). InnerVolumeSpecName "kube-api-access-q5ncq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:50:24 crc kubenswrapper[4696]: I0318 15:50:24.054749 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5ncq\" (UniqueName: \"kubernetes.io/projected/4b6ad1a0-39d2-4efa-b976-6879406394d3-kube-api-access-q5ncq\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:24 crc kubenswrapper[4696]: I0318 15:50:24.594387 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-lks4l" event={"ID":"4b6ad1a0-39d2-4efa-b976-6879406394d3","Type":"ContainerDied","Data":"9185129af21fd9af96b97e45ebe77ccecb8550f753fb4cd9b10f6fc6290f321a"} Mar 18 15:50:24 crc kubenswrapper[4696]: I0318 15:50:24.595004 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9185129af21fd9af96b97e45ebe77ccecb8550f753fb4cd9b10f6fc6290f321a" Mar 18 15:50:24 crc kubenswrapper[4696]: I0318 15:50:24.594445 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-lks4l" Mar 18 15:50:24 crc kubenswrapper[4696]: I0318 15:50:24.642887 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-dmqz5"] Mar 18 15:50:24 crc kubenswrapper[4696]: I0318 15:50:24.647350 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-dmqz5"] Mar 18 15:50:25 crc kubenswrapper[4696]: I0318 15:50:25.603409 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b71ed20f-912f-45b5-9b48-8932987d8bb0" path="/var/lib/kubelet/pods/b71ed20f-912f-45b5-9b48-8932987d8bb0/volumes" Mar 18 15:50:30 crc kubenswrapper[4696]: I0318 15:50:30.017493 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-574tr" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.305878 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh"] Mar 18 15:50:46 crc kubenswrapper[4696]: E0318 15:50:46.306625 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6ad1a0-39d2-4efa-b976-6879406394d3" containerName="oc" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.306638 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6ad1a0-39d2-4efa-b976-6879406394d3" containerName="oc" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.306732 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6ad1a0-39d2-4efa-b976-6879406394d3" containerName="oc" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.307418 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.310118 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.318206 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh"] Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.374279 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.374395 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs297\" (UniqueName: 
\"kubernetes.io/projected/60e108b1-f55a-4685-8219-7be977826f05-kube-api-access-hs297\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.374454 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.475582 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.475665 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.475739 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs297\" (UniqueName: \"kubernetes.io/projected/60e108b1-f55a-4685-8219-7be977826f05-kube-api-access-hs297\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.476177 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.476186 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.499557 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs297\" (UniqueName: \"kubernetes.io/projected/60e108b1-f55a-4685-8219-7be977826f05-kube-api-access-hs297\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:46 crc kubenswrapper[4696]: I0318 15:50:46.625481 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:47 crc kubenswrapper[4696]: I0318 15:50:47.053114 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh"] Mar 18 15:50:47 crc kubenswrapper[4696]: I0318 15:50:47.756676 4696 generic.go:334] "Generic (PLEG): container finished" podID="60e108b1-f55a-4685-8219-7be977826f05" containerID="ddf27e2291ce899ad1484056b6a721e5ba7450a28eaeaef3aa66bb181617a0d8" exitCode=0 Mar 18 15:50:47 crc kubenswrapper[4696]: I0318 15:50:47.756814 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" event={"ID":"60e108b1-f55a-4685-8219-7be977826f05","Type":"ContainerDied","Data":"ddf27e2291ce899ad1484056b6a721e5ba7450a28eaeaef3aa66bb181617a0d8"} Mar 18 15:50:47 crc kubenswrapper[4696]: I0318 15:50:47.757096 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" event={"ID":"60e108b1-f55a-4685-8219-7be977826f05","Type":"ContainerStarted","Data":"26e237dd5cb1290060510267a656cfd151e7ac6a6bb549d2d801a59d4fe8e7ca"} Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.265381 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hm95v"] Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.267000 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.288081 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hm95v"] Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.302914 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-catalog-content\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.302965 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-utilities\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.303004 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5pzh\" (UniqueName: \"kubernetes.io/projected/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-kube-api-access-r5pzh\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.404177 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-catalog-content\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.404222 4696 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-utilities\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.404254 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5pzh\" (UniqueName: \"kubernetes.io/projected/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-kube-api-access-r5pzh\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.404731 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-catalog-content\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.404965 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-utilities\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.423305 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5pzh\" (UniqueName: \"kubernetes.io/projected/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-kube-api-access-r5pzh\") pod \"redhat-operators-hm95v\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:48 crc kubenswrapper[4696]: I0318 15:50:48.603877 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:49 crc kubenswrapper[4696]: I0318 15:50:49.040373 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hm95v"] Mar 18 15:50:49 crc kubenswrapper[4696]: W0318 15:50:49.043795 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd64e9c8_3ba2_49f0_9bf8_6bc3518ad009.slice/crio-96c79fc3d9ebf67ebef62c358bc4b4d72a5b43a6e7866085e08780bebafbc54c WatchSource:0}: Error finding container 96c79fc3d9ebf67ebef62c358bc4b4d72a5b43a6e7866085e08780bebafbc54c: Status 404 returned error can't find the container with id 96c79fc3d9ebf67ebef62c358bc4b4d72a5b43a6e7866085e08780bebafbc54c Mar 18 15:50:49 crc kubenswrapper[4696]: I0318 15:50:49.772159 4696 generic.go:334] "Generic (PLEG): container finished" podID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerID="4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3" exitCode=0 Mar 18 15:50:49 crc kubenswrapper[4696]: I0318 15:50:49.772241 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm95v" event={"ID":"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009","Type":"ContainerDied","Data":"4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3"} Mar 18 15:50:49 crc kubenswrapper[4696]: I0318 15:50:49.772882 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm95v" event={"ID":"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009","Type":"ContainerStarted","Data":"96c79fc3d9ebf67ebef62c358bc4b4d72a5b43a6e7866085e08780bebafbc54c"} Mar 18 15:50:50 crc kubenswrapper[4696]: I0318 15:50:50.785924 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm95v" 
event={"ID":"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009","Type":"ContainerStarted","Data":"9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b"} Mar 18 15:50:50 crc kubenswrapper[4696]: I0318 15:50:50.789789 4696 generic.go:334] "Generic (PLEG): container finished" podID="60e108b1-f55a-4685-8219-7be977826f05" containerID="d48d9fccda42b22a937ac04f86e8fa36332e193e8c8e46892fe345554def991b" exitCode=0 Mar 18 15:50:50 crc kubenswrapper[4696]: I0318 15:50:50.789850 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" event={"ID":"60e108b1-f55a-4685-8219-7be977826f05","Type":"ContainerDied","Data":"d48d9fccda42b22a937ac04f86e8fa36332e193e8c8e46892fe345554def991b"} Mar 18 15:50:51 crc kubenswrapper[4696]: I0318 15:50:51.797700 4696 generic.go:334] "Generic (PLEG): container finished" podID="60e108b1-f55a-4685-8219-7be977826f05" containerID="d0eec1e063c99ae797ac600db72b6101198c9fbc1a812a63e7255a25248c69a0" exitCode=0 Mar 18 15:50:51 crc kubenswrapper[4696]: I0318 15:50:51.797772 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" event={"ID":"60e108b1-f55a-4685-8219-7be977826f05","Type":"ContainerDied","Data":"d0eec1e063c99ae797ac600db72b6101198c9fbc1a812a63e7255a25248c69a0"} Mar 18 15:50:52 crc kubenswrapper[4696]: I0318 15:50:52.822082 4696 generic.go:334] "Generic (PLEG): container finished" podID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerID="9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b" exitCode=0 Mar 18 15:50:52 crc kubenswrapper[4696]: I0318 15:50:52.822281 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm95v" event={"ID":"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009","Type":"ContainerDied","Data":"9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b"} Mar 18 15:50:53 crc kubenswrapper[4696]: 
I0318 15:50:53.042165 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.070096 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-bundle\") pod \"60e108b1-f55a-4685-8219-7be977826f05\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.070246 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-util\") pod \"60e108b1-f55a-4685-8219-7be977826f05\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.070322 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs297\" (UniqueName: \"kubernetes.io/projected/60e108b1-f55a-4685-8219-7be977826f05-kube-api-access-hs297\") pod \"60e108b1-f55a-4685-8219-7be977826f05\" (UID: \"60e108b1-f55a-4685-8219-7be977826f05\") " Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.070833 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-bundle" (OuterVolumeSpecName: "bundle") pod "60e108b1-f55a-4685-8219-7be977826f05" (UID: "60e108b1-f55a-4685-8219-7be977826f05"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.077050 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e108b1-f55a-4685-8219-7be977826f05-kube-api-access-hs297" (OuterVolumeSpecName: "kube-api-access-hs297") pod "60e108b1-f55a-4685-8219-7be977826f05" (UID: "60e108b1-f55a-4685-8219-7be977826f05"). InnerVolumeSpecName "kube-api-access-hs297". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.082131 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-util" (OuterVolumeSpecName: "util") pod "60e108b1-f55a-4685-8219-7be977826f05" (UID: "60e108b1-f55a-4685-8219-7be977826f05"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.172180 4696 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.172245 4696 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60e108b1-f55a-4685-8219-7be977826f05-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.172264 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs297\" (UniqueName: \"kubernetes.io/projected/60e108b1-f55a-4685-8219-7be977826f05-kube-api-access-hs297\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.832059 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm95v" 
event={"ID":"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009","Type":"ContainerStarted","Data":"32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c"} Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.835902 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" event={"ID":"60e108b1-f55a-4685-8219-7be977826f05","Type":"ContainerDied","Data":"26e237dd5cb1290060510267a656cfd151e7ac6a6bb549d2d801a59d4fe8e7ca"} Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.835954 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26e237dd5cb1290060510267a656cfd151e7ac6a6bb549d2d801a59d4fe8e7ca" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.835957 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh" Mar 18 15:50:53 crc kubenswrapper[4696]: I0318 15:50:53.858751 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hm95v" podStartSLOduration=2.254729211 podStartE2EDuration="5.858725854s" podCreationTimestamp="2026-03-18 15:50:48 +0000 UTC" firstStartedPulling="2026-03-18 15:50:49.820257672 +0000 UTC m=+892.826431878" lastFinishedPulling="2026-03-18 15:50:53.424254305 +0000 UTC m=+896.430428521" observedRunningTime="2026-03-18 15:50:53.856940339 +0000 UTC m=+896.863114545" watchObservedRunningTime="2026-03-18 15:50:53.858725854 +0000 UTC m=+896.864900060" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.575876 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f8s88"] Mar 18 15:50:56 crc kubenswrapper[4696]: E0318 15:50:56.576768 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e108b1-f55a-4685-8219-7be977826f05" containerName="util" Mar 18 15:50:56 crc 
kubenswrapper[4696]: I0318 15:50:56.576788 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e108b1-f55a-4685-8219-7be977826f05" containerName="util" Mar 18 15:50:56 crc kubenswrapper[4696]: E0318 15:50:56.576816 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e108b1-f55a-4685-8219-7be977826f05" containerName="pull" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.576827 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e108b1-f55a-4685-8219-7be977826f05" containerName="pull" Mar 18 15:50:56 crc kubenswrapper[4696]: E0318 15:50:56.576845 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e108b1-f55a-4685-8219-7be977826f05" containerName="extract" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.576855 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e108b1-f55a-4685-8219-7be977826f05" containerName="extract" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.576983 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e108b1-f55a-4685-8219-7be977826f05" containerName="extract" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.577611 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f8s88" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.580371 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jqph8" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.581559 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.581906 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.597115 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f8s88"] Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.621726 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjk22\" (UniqueName: \"kubernetes.io/projected/3c50fff4-65a2-49c1-997a-658bc72f1fe7-kube-api-access-gjk22\") pod \"nmstate-operator-796d4cfff4-f8s88\" (UID: \"3c50fff4-65a2-49c1-997a-658bc72f1fe7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f8s88" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.723067 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjk22\" (UniqueName: \"kubernetes.io/projected/3c50fff4-65a2-49c1-997a-658bc72f1fe7-kube-api-access-gjk22\") pod \"nmstate-operator-796d4cfff4-f8s88\" (UID: \"3c50fff4-65a2-49c1-997a-658bc72f1fe7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f8s88" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.749064 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjk22\" (UniqueName: \"kubernetes.io/projected/3c50fff4-65a2-49c1-997a-658bc72f1fe7-kube-api-access-gjk22\") pod \"nmstate-operator-796d4cfff4-f8s88\" (UID: 
\"3c50fff4-65a2-49c1-997a-658bc72f1fe7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f8s88" Mar 18 15:50:56 crc kubenswrapper[4696]: I0318 15:50:56.897960 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f8s88" Mar 18 15:50:57 crc kubenswrapper[4696]: I0318 15:50:57.351410 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f8s88"] Mar 18 15:50:57 crc kubenswrapper[4696]: I0318 15:50:57.863666 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f8s88" event={"ID":"3c50fff4-65a2-49c1-997a-658bc72f1fe7","Type":"ContainerStarted","Data":"0aa39205772b75acea1e27687eee111dbf025e823227b4dde9e7941e383b8317"} Mar 18 15:50:58 crc kubenswrapper[4696]: I0318 15:50:58.604708 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:58 crc kubenswrapper[4696]: I0318 15:50:58.604769 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:50:59 crc kubenswrapper[4696]: I0318 15:50:59.649842 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hm95v" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="registry-server" probeResult="failure" output=< Mar 18 15:50:59 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 15:50:59 crc kubenswrapper[4696]: > Mar 18 15:51:01 crc kubenswrapper[4696]: I0318 15:51:01.907437 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f8s88" event={"ID":"3c50fff4-65a2-49c1-997a-658bc72f1fe7","Type":"ContainerStarted","Data":"12275107751f718298572f5ccc8bf965304786f9bf2631ab4834350f31628435"} Mar 18 15:51:01 crc kubenswrapper[4696]: I0318 15:51:01.925853 4696 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f8s88" podStartSLOduration=1.884653461 podStartE2EDuration="5.925809771s" podCreationTimestamp="2026-03-18 15:50:56 +0000 UTC" firstStartedPulling="2026-03-18 15:50:57.365379783 +0000 UTC m=+900.371553979" lastFinishedPulling="2026-03-18 15:51:01.406536083 +0000 UTC m=+904.412710289" observedRunningTime="2026-03-18 15:51:01.922824866 +0000 UTC m=+904.928999072" watchObservedRunningTime="2026-03-18 15:51:01.925809771 +0000 UTC m=+904.931983987" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.109648 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.111881 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.116406 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tnfbz" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.138215 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.143042 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.144644 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.147251 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.151610 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qtwr9"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.152717 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.155959 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rdd\" (UniqueName: \"kubernetes.io/projected/90a85ff7-6f9a-40c4-b528-15f0c3739a2b-kube-api-access-r4rdd\") pod \"nmstate-metrics-9b8c8685d-jlrq4\" (UID: \"90a85ff7-6f9a-40c4-b528-15f0c3739a2b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.204175 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.250240 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.251262 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.253938 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.254212 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.254442 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-2rg5p" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.257288 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rdd\" (UniqueName: \"kubernetes.io/projected/90a85ff7-6f9a-40c4-b528-15f0c3739a2b-kube-api-access-r4rdd\") pod \"nmstate-metrics-9b8c8685d-jlrq4\" (UID: \"90a85ff7-6f9a-40c4-b528-15f0c3739a2b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.257365 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-nmstate-lock\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.257406 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-ovs-socket\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.257438 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" 
(UniqueName: \"kubernetes.io/secret/d0117f34-5320-46c0-952f-54d4abacdce4-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-pn2r6\" (UID: \"d0117f34-5320-46c0-952f-54d4abacdce4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.257459 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6k22\" (UniqueName: \"kubernetes.io/projected/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-kube-api-access-b6k22\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.257483 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-dbus-socket\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.257506 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxjkb\" (UniqueName: \"kubernetes.io/projected/d0117f34-5320-46c0-952f-54d4abacdce4-kube-api-access-xxjkb\") pod \"nmstate-webhook-5f558f5558-pn2r6\" (UID: \"d0117f34-5320-46c0-952f-54d4abacdce4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.281867 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.294618 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rdd\" (UniqueName: \"kubernetes.io/projected/90a85ff7-6f9a-40c4-b528-15f0c3739a2b-kube-api-access-r4rdd\") pod \"nmstate-metrics-9b8c8685d-jlrq4\" (UID: 
\"90a85ff7-6f9a-40c4-b528-15f0c3739a2b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359417 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d0117f34-5320-46c0-952f-54d4abacdce4-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-pn2r6\" (UID: \"d0117f34-5320-46c0-952f-54d4abacdce4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359480 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6k22\" (UniqueName: \"kubernetes.io/projected/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-kube-api-access-b6k22\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359507 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vw9\" (UniqueName: \"kubernetes.io/projected/ea18cdb1-cb1f-46b3-af17-e834b51c6803-kube-api-access-c7vw9\") pod \"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359552 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea18cdb1-cb1f-46b3-af17-e834b51c6803-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359576 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-dbus-socket\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359601 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea18cdb1-cb1f-46b3-af17-e834b51c6803-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359619 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxjkb\" (UniqueName: \"kubernetes.io/projected/d0117f34-5320-46c0-952f-54d4abacdce4-kube-api-access-xxjkb\") pod \"nmstate-webhook-5f558f5558-pn2r6\" (UID: \"d0117f34-5320-46c0-952f-54d4abacdce4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359661 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-nmstate-lock\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359692 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-ovs-socket\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.359774 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-ovs-socket\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: E0318 15:51:06.359909 4696 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 15:51:06 crc kubenswrapper[4696]: E0318 15:51:06.359973 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0117f34-5320-46c0-952f-54d4abacdce4-tls-key-pair podName:d0117f34-5320-46c0-952f-54d4abacdce4 nodeName:}" failed. No retries permitted until 2026-03-18 15:51:06.859948553 +0000 UTC m=+909.866122759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/d0117f34-5320-46c0-952f-54d4abacdce4-tls-key-pair") pod "nmstate-webhook-5f558f5558-pn2r6" (UID: "d0117f34-5320-46c0-952f-54d4abacdce4") : secret "openshift-nmstate-webhook" not found Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.360416 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-nmstate-lock\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.360651 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-dbus-socket\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.379990 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6k22\" (UniqueName: 
\"kubernetes.io/projected/7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704-kube-api-access-b6k22\") pod \"nmstate-handler-qtwr9\" (UID: \"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704\") " pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.401245 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxjkb\" (UniqueName: \"kubernetes.io/projected/d0117f34-5320-46c0-952f-54d4abacdce4-kube-api-access-xxjkb\") pod \"nmstate-webhook-5f558f5558-pn2r6\" (UID: \"d0117f34-5320-46c0-952f-54d4abacdce4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.434076 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.457376 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-678558f478-2sc98"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.458327 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.460969 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vw9\" (UniqueName: \"kubernetes.io/projected/ea18cdb1-cb1f-46b3-af17-e834b51c6803-kube-api-access-c7vw9\") pod \"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.461017 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea18cdb1-cb1f-46b3-af17-e834b51c6803-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.461050 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea18cdb1-cb1f-46b3-af17-e834b51c6803-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.462280 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea18cdb1-cb1f-46b3-af17-e834b51c6803-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.466716 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea18cdb1-cb1f-46b3-af17-e834b51c6803-plugin-serving-cert\") pod 
\"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.488478 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-678558f478-2sc98"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.491665 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vw9\" (UniqueName: \"kubernetes.io/projected/ea18cdb1-cb1f-46b3-af17-e834b51c6803-kube-api-access-c7vw9\") pod \"nmstate-console-plugin-86f58fcf4-pngnv\" (UID: \"ea18cdb1-cb1f-46b3-af17-e834b51c6803\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.493811 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.562428 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-trusted-ca-bundle\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.562638 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-serving-cert\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.562662 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ll69\" (UniqueName: 
\"kubernetes.io/projected/eea7f7fb-acec-4693-94a5-b12f4400f1d9-kube-api-access-9ll69\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.562714 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-config\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.562747 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-oauth-serving-cert\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.562829 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-oauth-config\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.562895 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-service-ca\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.571650 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.663872 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-serving-cert\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.663930 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ll69\" (UniqueName: \"kubernetes.io/projected/eea7f7fb-acec-4693-94a5-b12f4400f1d9-kube-api-access-9ll69\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.663990 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-config\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.664026 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-oauth-serving-cert\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.664055 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-oauth-config\") pod \"console-678558f478-2sc98\" (UID: 
\"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.664090 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-service-ca\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.664127 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-trusted-ca-bundle\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.665553 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-trusted-ca-bundle\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.666285 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-oauth-serving-cert\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.667166 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-service-ca\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " 
pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.667309 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-config\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.671792 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-oauth-config\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.672905 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eea7f7fb-acec-4693-94a5-b12f4400f1d9-console-serving-cert\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.683027 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ll69\" (UniqueName: \"kubernetes.io/projected/eea7f7fb-acec-4693-94a5-b12f4400f1d9-kube-api-access-9ll69\") pod \"console-678558f478-2sc98\" (UID: \"eea7f7fb-acec-4693-94a5-b12f4400f1d9\") " pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.788512 4696 scope.go:117] "RemoveContainer" containerID="9f7a9e75a3711fcc8ccd1a5f8e4dd218fc618fb224979b9ecff4a4fe716c241d" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.837729 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv"] Mar 18 15:51:06 crc 
kubenswrapper[4696]: W0318 15:51:06.840098 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea18cdb1_cb1f_46b3_af17_e834b51c6803.slice/crio-41d729f51881ca371bf2f6de439babdcb658cd7d4f1503374faa264d53fd65e5 WatchSource:0}: Error finding container 41d729f51881ca371bf2f6de439babdcb658cd7d4f1503374faa264d53fd65e5: Status 404 returned error can't find the container with id 41d729f51881ca371bf2f6de439babdcb658cd7d4f1503374faa264d53fd65e5 Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.840173 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.866571 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d0117f34-5320-46c0-952f-54d4abacdce4-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-pn2r6\" (UID: \"d0117f34-5320-46c0-952f-54d4abacdce4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.869902 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d0117f34-5320-46c0-952f-54d4abacdce4-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-pn2r6\" (UID: \"d0117f34-5320-46c0-952f-54d4abacdce4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.927308 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4"] Mar 18 15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.940118 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qtwr9" event={"ID":"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704","Type":"ContainerStarted","Data":"cc6976cebcda7983454525af8ccce5921fa42ce04729d2e137fc23cd4b424226"} Mar 18 
15:51:06 crc kubenswrapper[4696]: I0318 15:51:06.941348 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" event={"ID":"ea18cdb1-cb1f-46b3-af17-e834b51c6803","Type":"ContainerStarted","Data":"41d729f51881ca371bf2f6de439babdcb658cd7d4f1503374faa264d53fd65e5"} Mar 18 15:51:07 crc kubenswrapper[4696]: I0318 15:51:07.081364 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:07 crc kubenswrapper[4696]: I0318 15:51:07.255393 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-678558f478-2sc98"] Mar 18 15:51:07 crc kubenswrapper[4696]: W0318 15:51:07.265731 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea7f7fb_acec_4693_94a5_b12f4400f1d9.slice/crio-a92c52e67397828a286752ee10177bf67e8010c7e9ec3015f19e107449d3d741 WatchSource:0}: Error finding container a92c52e67397828a286752ee10177bf67e8010c7e9ec3015f19e107449d3d741: Status 404 returned error can't find the container with id a92c52e67397828a286752ee10177bf67e8010c7e9ec3015f19e107449d3d741 Mar 18 15:51:07 crc kubenswrapper[4696]: I0318 15:51:07.293072 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6"] Mar 18 15:51:07 crc kubenswrapper[4696]: W0318 15:51:07.300048 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0117f34_5320_46c0_952f_54d4abacdce4.slice/crio-97bf699f389e407f6d83234ee7d2dae9aa45bf2c8c7e716086271ade3aae78ae WatchSource:0}: Error finding container 97bf699f389e407f6d83234ee7d2dae9aa45bf2c8c7e716086271ade3aae78ae: Status 404 returned error can't find the container with id 97bf699f389e407f6d83234ee7d2dae9aa45bf2c8c7e716086271ade3aae78ae Mar 18 15:51:07 crc kubenswrapper[4696]: I0318 
15:51:07.948254 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678558f478-2sc98" event={"ID":"eea7f7fb-acec-4693-94a5-b12f4400f1d9","Type":"ContainerStarted","Data":"9cb48c249308734cf72daef419851c611d542bd9adf2525d172414d2b347f157"} Mar 18 15:51:07 crc kubenswrapper[4696]: I0318 15:51:07.948316 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678558f478-2sc98" event={"ID":"eea7f7fb-acec-4693-94a5-b12f4400f1d9","Type":"ContainerStarted","Data":"a92c52e67397828a286752ee10177bf67e8010c7e9ec3015f19e107449d3d741"} Mar 18 15:51:07 crc kubenswrapper[4696]: I0318 15:51:07.949173 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" event={"ID":"d0117f34-5320-46c0-952f-54d4abacdce4","Type":"ContainerStarted","Data":"97bf699f389e407f6d83234ee7d2dae9aa45bf2c8c7e716086271ade3aae78ae"} Mar 18 15:51:07 crc kubenswrapper[4696]: I0318 15:51:07.950463 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" event={"ID":"90a85ff7-6f9a-40c4-b528-15f0c3739a2b","Type":"ContainerStarted","Data":"28ed1bc18906ecde45a81fc16ed07b63d7f31e97915df19e92c46b26f7a320b6"} Mar 18 15:51:07 crc kubenswrapper[4696]: I0318 15:51:07.969257 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-678558f478-2sc98" podStartSLOduration=1.969238064 podStartE2EDuration="1.969238064s" podCreationTimestamp="2026-03-18 15:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:51:07.967914041 +0000 UTC m=+910.974088247" watchObservedRunningTime="2026-03-18 15:51:07.969238064 +0000 UTC m=+910.975412270" Mar 18 15:51:08 crc kubenswrapper[4696]: I0318 15:51:08.681190 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:51:08 crc kubenswrapper[4696]: I0318 15:51:08.745026 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:51:08 crc kubenswrapper[4696]: I0318 15:51:08.911305 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hm95v"] Mar 18 15:51:09 crc kubenswrapper[4696]: I0318 15:51:09.960862 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hm95v" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="registry-server" containerID="cri-o://32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c" gracePeriod=2 Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.368108 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.420855 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-catalog-content\") pod \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.420931 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5pzh\" (UniqueName: \"kubernetes.io/projected/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-kube-api-access-r5pzh\") pod \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\" (UID: \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.421164 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-utilities\") pod \"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\" (UID: 
\"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009\") " Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.422198 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-utilities" (OuterVolumeSpecName: "utilities") pod "fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" (UID: "fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.425312 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-kube-api-access-r5pzh" (OuterVolumeSpecName: "kube-api-access-r5pzh") pod "fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" (UID: "fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009"). InnerVolumeSpecName "kube-api-access-r5pzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.522199 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5pzh\" (UniqueName: \"kubernetes.io/projected/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-kube-api-access-r5pzh\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.522482 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.549858 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" (UID: "fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.624173 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.970195 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qtwr9" event={"ID":"7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704","Type":"ContainerStarted","Data":"cd519aaab081b03d98ac17881ff4fe854a1d0ffd2188cad2f2958bfeaf3122ba"} Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.970320 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.973948 4696 generic.go:334] "Generic (PLEG): container finished" podID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerID="32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c" exitCode=0 Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.974028 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm95v" event={"ID":"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009","Type":"ContainerDied","Data":"32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c"} Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.974060 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm95v" event={"ID":"fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009","Type":"ContainerDied","Data":"96c79fc3d9ebf67ebef62c358bc4b4d72a5b43a6e7866085e08780bebafbc54c"} Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.974082 4696 scope.go:117] "RemoveContainer" containerID="32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.975189 4696 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hm95v" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.976005 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" event={"ID":"d0117f34-5320-46c0-952f-54d4abacdce4","Type":"ContainerStarted","Data":"abb417029118869c6fd1ad6c914f8d5e44aa28a7a34f828ddcc15650cdd056a3"} Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.976151 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.977481 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" event={"ID":"90a85ff7-6f9a-40c4-b528-15f0c3739a2b","Type":"ContainerStarted","Data":"07c3e4d79f43c47e42ec1cbb50d97b71b8a618ea47f8a40b70138af939acc625"} Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.981476 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" event={"ID":"ea18cdb1-cb1f-46b3-af17-e834b51c6803","Type":"ContainerStarted","Data":"50eac83b1fa2587c5f27795f0fa1ef89aedf6476dc4583eebf822403169f35ca"} Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.992427 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qtwr9" podStartSLOduration=1.5014612299999999 podStartE2EDuration="4.992408603s" podCreationTimestamp="2026-03-18 15:51:06 +0000 UTC" firstStartedPulling="2026-03-18 15:51:06.532755562 +0000 UTC m=+909.538929768" lastFinishedPulling="2026-03-18 15:51:10.023702935 +0000 UTC m=+913.029877141" observedRunningTime="2026-03-18 15:51:10.989268371 +0000 UTC m=+913.995442577" watchObservedRunningTime="2026-03-18 15:51:10.992408603 +0000 UTC m=+913.998582819" Mar 18 15:51:10 crc kubenswrapper[4696]: I0318 15:51:10.998395 4696 scope.go:117] "RemoveContainer" 
containerID="9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.011697 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" podStartSLOduration=2.26640932 podStartE2EDuration="5.011680835s" podCreationTimestamp="2026-03-18 15:51:06 +0000 UTC" firstStartedPulling="2026-03-18 15:51:07.302693292 +0000 UTC m=+910.308867498" lastFinishedPulling="2026-03-18 15:51:10.047964807 +0000 UTC m=+913.054139013" observedRunningTime="2026-03-18 15:51:11.010133454 +0000 UTC m=+914.016307660" watchObservedRunningTime="2026-03-18 15:51:11.011680835 +0000 UTC m=+914.017855041" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.025451 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-pngnv" podStartSLOduration=1.866729474 podStartE2EDuration="5.025429807s" podCreationTimestamp="2026-03-18 15:51:06 +0000 UTC" firstStartedPulling="2026-03-18 15:51:06.845358296 +0000 UTC m=+909.851532502" lastFinishedPulling="2026-03-18 15:51:10.004058629 +0000 UTC m=+913.010232835" observedRunningTime="2026-03-18 15:51:11.02456623 +0000 UTC m=+914.030740456" watchObservedRunningTime="2026-03-18 15:51:11.025429807 +0000 UTC m=+914.031604013" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.043992 4696 scope.go:117] "RemoveContainer" containerID="4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.045504 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hm95v"] Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.050216 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hm95v"] Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.059424 4696 scope.go:117] "RemoveContainer" 
containerID="32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c" Mar 18 15:51:11 crc kubenswrapper[4696]: E0318 15:51:11.059897 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c\": container with ID starting with 32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c not found: ID does not exist" containerID="32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.059965 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c"} err="failed to get container status \"32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c\": rpc error: code = NotFound desc = could not find container \"32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c\": container with ID starting with 32847bd8bd987b786381d8f32d67be5b70fb5475b4ffccf7a55471144537f48c not found: ID does not exist" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.060003 4696 scope.go:117] "RemoveContainer" containerID="9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b" Mar 18 15:51:11 crc kubenswrapper[4696]: E0318 15:51:11.060588 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b\": container with ID starting with 9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b not found: ID does not exist" containerID="9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.060649 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b"} err="failed to get container status \"9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b\": rpc error: code = NotFound desc = could not find container \"9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b\": container with ID starting with 9994812f5dab7f0b3a57ebd12768039c112b6878e85957d0e00f6276bc682b5b not found: ID does not exist" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.060695 4696 scope.go:117] "RemoveContainer" containerID="4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3" Mar 18 15:51:11 crc kubenswrapper[4696]: E0318 15:51:11.061061 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3\": container with ID starting with 4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3 not found: ID does not exist" containerID="4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.061124 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3"} err="failed to get container status \"4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3\": rpc error: code = NotFound desc = could not find container \"4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3\": container with ID starting with 4172fa769697beb73e302db270a861d88a8c4ec21ef557a99a688a4ba92da1d3 not found: ID does not exist" Mar 18 15:51:11 crc kubenswrapper[4696]: I0318 15:51:11.609508 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" path="/var/lib/kubelet/pods/fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009/volumes" Mar 18 15:51:12 crc kubenswrapper[4696]: I0318 
15:51:12.997752 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" event={"ID":"90a85ff7-6f9a-40c4-b528-15f0c3739a2b","Type":"ContainerStarted","Data":"91806afc5db50092f3be83c30dd393f15f2e578949115375b1b3022950a45138"} Mar 18 15:51:13 crc kubenswrapper[4696]: I0318 15:51:13.020770 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jlrq4" podStartSLOduration=1.279166746 podStartE2EDuration="7.020737224s" podCreationTimestamp="2026-03-18 15:51:06 +0000 UTC" firstStartedPulling="2026-03-18 15:51:06.934451013 +0000 UTC m=+909.940625229" lastFinishedPulling="2026-03-18 15:51:12.676021511 +0000 UTC m=+915.682195707" observedRunningTime="2026-03-18 15:51:13.012850308 +0000 UTC m=+916.019024514" watchObservedRunningTime="2026-03-18 15:51:13.020737224 +0000 UTC m=+916.026911430" Mar 18 15:51:16 crc kubenswrapper[4696]: I0318 15:51:16.525848 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qtwr9" Mar 18 15:51:16 crc kubenswrapper[4696]: I0318 15:51:16.840597 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:16 crc kubenswrapper[4696]: I0318 15:51:16.841028 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:16 crc kubenswrapper[4696]: I0318 15:51:16.846258 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:17 crc kubenswrapper[4696]: I0318 15:51:17.025701 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-678558f478-2sc98" Mar 18 15:51:17 crc kubenswrapper[4696]: I0318 15:51:17.092835 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-qbzqg"] Mar 18 15:51:27 crc kubenswrapper[4696]: I0318 15:51:27.087938 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-pn2r6" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.037229 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft"] Mar 18 15:51:40 crc kubenswrapper[4696]: E0318 15:51:40.037986 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="extract-utilities" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.037999 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="extract-utilities" Mar 18 15:51:40 crc kubenswrapper[4696]: E0318 15:51:40.038009 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="registry-server" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.038015 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="registry-server" Mar 18 15:51:40 crc kubenswrapper[4696]: E0318 15:51:40.038028 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="extract-content" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.038034 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="extract-content" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.038123 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd64e9c8-3ba2-49f0-9bf8-6bc3518ad009" containerName="registry-server" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.038864 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.041062 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.050922 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft"] Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.082779 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.083090 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.083124 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94c8n\" (UniqueName: \"kubernetes.io/projected/70134d43-b7f8-4a91-864c-9e680a9e8ae9-kube-api-access-94c8n\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: 
I0318 15:51:40.184052 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.184108 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.184139 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94c8n\" (UniqueName: \"kubernetes.io/projected/70134d43-b7f8-4a91-864c-9e680a9e8ae9-kube-api-access-94c8n\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.184688 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.184744 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.202233 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94c8n\" (UniqueName: \"kubernetes.io/projected/70134d43-b7f8-4a91-864c-9e680a9e8ae9-kube-api-access-94c8n\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.358952 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:40 crc kubenswrapper[4696]: I0318 15:51:40.745721 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft"] Mar 18 15:51:41 crc kubenswrapper[4696]: I0318 15:51:41.190368 4696 generic.go:334] "Generic (PLEG): container finished" podID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerID="5a05467b55939156b4564588cb8556af5b58fbae31c948d5d3cdf9e1875646c4" exitCode=0 Mar 18 15:51:41 crc kubenswrapper[4696]: I0318 15:51:41.190419 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" event={"ID":"70134d43-b7f8-4a91-864c-9e680a9e8ae9","Type":"ContainerDied","Data":"5a05467b55939156b4564588cb8556af5b58fbae31c948d5d3cdf9e1875646c4"} Mar 18 15:51:41 crc kubenswrapper[4696]: I0318 15:51:41.190447 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" event={"ID":"70134d43-b7f8-4a91-864c-9e680a9e8ae9","Type":"ContainerStarted","Data":"a99c9edc40c4392773805e1eb90a05852ffa83b30648ae8f1a7528b5c20e10ff"} Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.141253 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-qbzqg" podUID="3733dd99-82f2-4602-b0e2-ece3c16cd446" containerName="console" containerID="cri-o://fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f" gracePeriod=15 Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.184409 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.184804 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.513652 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qbzqg_3733dd99-82f2-4602-b0e2-ece3c16cd446/console/0.log" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.513724 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.715440 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-serving-cert\") pod \"3733dd99-82f2-4602-b0e2-ece3c16cd446\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.715792 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-trusted-ca-bundle\") pod \"3733dd99-82f2-4602-b0e2-ece3c16cd446\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.715829 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-oauth-serving-cert\") pod \"3733dd99-82f2-4602-b0e2-ece3c16cd446\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.715858 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-config\") pod \"3733dd99-82f2-4602-b0e2-ece3c16cd446\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.715877 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-service-ca\") pod \"3733dd99-82f2-4602-b0e2-ece3c16cd446\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.715901 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-oauth-config\") pod \"3733dd99-82f2-4602-b0e2-ece3c16cd446\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.715937 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75qfg\" (UniqueName: \"kubernetes.io/projected/3733dd99-82f2-4602-b0e2-ece3c16cd446-kube-api-access-75qfg\") pod \"3733dd99-82f2-4602-b0e2-ece3c16cd446\" (UID: \"3733dd99-82f2-4602-b0e2-ece3c16cd446\") " Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.717009 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-config" (OuterVolumeSpecName: "console-config") pod "3733dd99-82f2-4602-b0e2-ece3c16cd446" (UID: "3733dd99-82f2-4602-b0e2-ece3c16cd446"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.717025 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3733dd99-82f2-4602-b0e2-ece3c16cd446" (UID: "3733dd99-82f2-4602-b0e2-ece3c16cd446"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.717038 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-service-ca" (OuterVolumeSpecName: "service-ca") pod "3733dd99-82f2-4602-b0e2-ece3c16cd446" (UID: "3733dd99-82f2-4602-b0e2-ece3c16cd446"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.717403 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3733dd99-82f2-4602-b0e2-ece3c16cd446" (UID: "3733dd99-82f2-4602-b0e2-ece3c16cd446"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.722050 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3733dd99-82f2-4602-b0e2-ece3c16cd446" (UID: "3733dd99-82f2-4602-b0e2-ece3c16cd446"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.722806 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3733dd99-82f2-4602-b0e2-ece3c16cd446" (UID: "3733dd99-82f2-4602-b0e2-ece3c16cd446"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.722867 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3733dd99-82f2-4602-b0e2-ece3c16cd446-kube-api-access-75qfg" (OuterVolumeSpecName: "kube-api-access-75qfg") pod "3733dd99-82f2-4602-b0e2-ece3c16cd446" (UID: "3733dd99-82f2-4602-b0e2-ece3c16cd446"). InnerVolumeSpecName "kube-api-access-75qfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.817108 4696 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.817373 4696 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.817470 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.817550 4696 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.817607 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75qfg\" (UniqueName: \"kubernetes.io/projected/3733dd99-82f2-4602-b0e2-ece3c16cd446-kube-api-access-75qfg\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.817694 4696 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3733dd99-82f2-4602-b0e2-ece3c16cd446-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:42 crc kubenswrapper[4696]: I0318 15:51:42.817771 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3733dd99-82f2-4602-b0e2-ece3c16cd446-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:43 crc 
kubenswrapper[4696]: I0318 15:51:43.202783 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qbzqg_3733dd99-82f2-4602-b0e2-ece3c16cd446/console/0.log" Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.202846 4696 generic.go:334] "Generic (PLEG): container finished" podID="3733dd99-82f2-4602-b0e2-ece3c16cd446" containerID="fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f" exitCode=2 Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.202927 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qbzqg" Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.202939 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qbzqg" event={"ID":"3733dd99-82f2-4602-b0e2-ece3c16cd446","Type":"ContainerDied","Data":"fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f"} Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.203000 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qbzqg" event={"ID":"3733dd99-82f2-4602-b0e2-ece3c16cd446","Type":"ContainerDied","Data":"d6fbf371b1d83b5ff607d9ca941119f047eccd764e55b0ea4fa53819d52df8a3"} Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.203018 4696 scope.go:117] "RemoveContainer" containerID="fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f" Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.206114 4696 generic.go:334] "Generic (PLEG): container finished" podID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerID="4cd8d5578bedabadb04dc10f2b8eab73ac25a275e1ef104dfc940c8b5dc116c4" exitCode=0 Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.206168 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" 
event={"ID":"70134d43-b7f8-4a91-864c-9e680a9e8ae9","Type":"ContainerDied","Data":"4cd8d5578bedabadb04dc10f2b8eab73ac25a275e1ef104dfc940c8b5dc116c4"} Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.219318 4696 scope.go:117] "RemoveContainer" containerID="fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f" Mar 18 15:51:43 crc kubenswrapper[4696]: E0318 15:51:43.219756 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f\": container with ID starting with fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f not found: ID does not exist" containerID="fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f" Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.219787 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f"} err="failed to get container status \"fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f\": rpc error: code = NotFound desc = could not find container \"fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f\": container with ID starting with fd96ab33d3dfaf55f17398e09133459f2f324fb276ebeacb70928223e74d619f not found: ID does not exist" Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.245097 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qbzqg"] Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.249917 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-qbzqg"] Mar 18 15:51:43 crc kubenswrapper[4696]: I0318 15:51:43.606419 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3733dd99-82f2-4602-b0e2-ece3c16cd446" path="/var/lib/kubelet/pods/3733dd99-82f2-4602-b0e2-ece3c16cd446/volumes" Mar 18 15:51:44 crc 
kubenswrapper[4696]: I0318 15:51:44.213691 4696 generic.go:334] "Generic (PLEG): container finished" podID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerID="0dbf36c71693453fea617a1070b8a34f707bbc2eda4216c477cb52bfd70ced1a" exitCode=0 Mar 18 15:51:44 crc kubenswrapper[4696]: I0318 15:51:44.213763 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" event={"ID":"70134d43-b7f8-4a91-864c-9e680a9e8ae9","Type":"ContainerDied","Data":"0dbf36c71693453fea617a1070b8a34f707bbc2eda4216c477cb52bfd70ced1a"} Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.457547 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.556939 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-bundle\") pod \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.557005 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94c8n\" (UniqueName: \"kubernetes.io/projected/70134d43-b7f8-4a91-864c-9e680a9e8ae9-kube-api-access-94c8n\") pod \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.557048 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-util\") pod \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\" (UID: \"70134d43-b7f8-4a91-864c-9e680a9e8ae9\") " Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.557947 4696 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-bundle" (OuterVolumeSpecName: "bundle") pod "70134d43-b7f8-4a91-864c-9e680a9e8ae9" (UID: "70134d43-b7f8-4a91-864c-9e680a9e8ae9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.563312 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70134d43-b7f8-4a91-864c-9e680a9e8ae9-kube-api-access-94c8n" (OuterVolumeSpecName: "kube-api-access-94c8n") pod "70134d43-b7f8-4a91-864c-9e680a9e8ae9" (UID: "70134d43-b7f8-4a91-864c-9e680a9e8ae9"). InnerVolumeSpecName "kube-api-access-94c8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.575170 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-util" (OuterVolumeSpecName: "util") pod "70134d43-b7f8-4a91-864c-9e680a9e8ae9" (UID: "70134d43-b7f8-4a91-864c-9e680a9e8ae9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.658056 4696 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.658086 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94c8n\" (UniqueName: \"kubernetes.io/projected/70134d43-b7f8-4a91-864c-9e680a9e8ae9-kube-api-access-94c8n\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:45 crc kubenswrapper[4696]: I0318 15:51:45.658098 4696 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70134d43-b7f8-4a91-864c-9e680a9e8ae9-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:46 crc kubenswrapper[4696]: I0318 15:51:46.235131 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" event={"ID":"70134d43-b7f8-4a91-864c-9e680a9e8ae9","Type":"ContainerDied","Data":"a99c9edc40c4392773805e1eb90a05852ffa83b30648ae8f1a7528b5c20e10ff"} Mar 18 15:51:46 crc kubenswrapper[4696]: I0318 15:51:46.235173 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft" Mar 18 15:51:46 crc kubenswrapper[4696]: I0318 15:51:46.235177 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a99c9edc40c4392773805e1eb90a05852ffa83b30648ae8f1a7528b5c20e10ff" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.596921 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lcfdt"] Mar 18 15:51:51 crc kubenswrapper[4696]: E0318 15:51:51.597853 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerName="pull" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.597866 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerName="pull" Mar 18 15:51:51 crc kubenswrapper[4696]: E0318 15:51:51.597877 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3733dd99-82f2-4602-b0e2-ece3c16cd446" containerName="console" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.597883 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3733dd99-82f2-4602-b0e2-ece3c16cd446" containerName="console" Mar 18 15:51:51 crc kubenswrapper[4696]: E0318 15:51:51.597901 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerName="extract" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.597907 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerName="extract" Mar 18 15:51:51 crc kubenswrapper[4696]: E0318 15:51:51.597920 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerName="util" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.597927 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" 
containerName="util" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.598044 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="70134d43-b7f8-4a91-864c-9e680a9e8ae9" containerName="extract" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.598054 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3733dd99-82f2-4602-b0e2-ece3c16cd446" containerName="console" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.598984 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.618649 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lcfdt"] Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.625499 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-utilities\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.625570 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-catalog-content\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.625803 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbrqt\" (UniqueName: \"kubernetes.io/projected/36c83acc-0ea1-4904-aeab-35b13fe3a92d-kube-api-access-hbrqt\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " 
pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.726781 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-utilities\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.726851 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-catalog-content\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.726945 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbrqt\" (UniqueName: \"kubernetes.io/projected/36c83acc-0ea1-4904-aeab-35b13fe3a92d-kube-api-access-hbrqt\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.727340 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-utilities\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.727394 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-catalog-content\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " 
pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.748491 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbrqt\" (UniqueName: \"kubernetes.io/projected/36c83acc-0ea1-4904-aeab-35b13fe3a92d-kube-api-access-hbrqt\") pod \"certified-operators-lcfdt\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:51 crc kubenswrapper[4696]: I0318 15:51:51.917129 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:51:52 crc kubenswrapper[4696]: I0318 15:51:52.217221 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lcfdt"] Mar 18 15:51:52 crc kubenswrapper[4696]: I0318 15:51:52.271440 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcfdt" event={"ID":"36c83acc-0ea1-4904-aeab-35b13fe3a92d","Type":"ContainerStarted","Data":"4b609eacd678af731e3ecbc3549ab644398f66302878a181664f93122c722335"} Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.279279 4696 generic.go:334] "Generic (PLEG): container finished" podID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerID="6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97" exitCode=0 Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.279373 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcfdt" event={"ID":"36c83acc-0ea1-4904-aeab-35b13fe3a92d","Type":"ContainerDied","Data":"6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97"} Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.395841 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bv6ct"] Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.397177 4696 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.409306 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bv6ct"] Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.597317 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68pr\" (UniqueName: \"kubernetes.io/projected/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-kube-api-access-d68pr\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.600869 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-utilities\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.600897 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-catalog-content\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.703025 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d68pr\" (UniqueName: \"kubernetes.io/projected/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-kube-api-access-d68pr\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.703085 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-utilities\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.703104 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-catalog-content\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.703675 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-catalog-content\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.703804 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-utilities\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:53 crc kubenswrapper[4696]: I0318 15:51:53.722476 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68pr\" (UniqueName: \"kubernetes.io/projected/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-kube-api-access-d68pr\") pod \"community-operators-bv6ct\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.019021 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.291819 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bv6ct"] Mar 18 15:51:54 crc kubenswrapper[4696]: W0318 15:51:54.297264 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf44929e_cf9c_4fac_95cf_aca629a8ba7a.slice/crio-5b49c545df1c0568656d84694cce7545ea4edabae34c2798e1f372693d1c10b0 WatchSource:0}: Error finding container 5b49c545df1c0568656d84694cce7545ea4edabae34c2798e1f372693d1c10b0: Status 404 returned error can't find the container with id 5b49c545df1c0568656d84694cce7545ea4edabae34c2798e1f372693d1c10b0 Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.965670 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl"] Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.966686 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.968915 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.969010 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-wzjs4" Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.969348 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.969602 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 15:51:54 crc kubenswrapper[4696]: I0318 15:51:54.970452 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.067018 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl"] Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.120405 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8smx9\" (UniqueName: \"kubernetes.io/projected/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-kube-api-access-8smx9\") pod \"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: \"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.120471 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-webhook-cert\") pod 
\"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: \"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.120493 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-apiservice-cert\") pod \"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: \"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.206721 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd"] Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.207652 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.209933 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-vws9m" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.209934 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.221594 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-apiservice-cert\") pod \"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: \"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.221660 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-mwpnr\" (UniqueName: \"kubernetes.io/projected/b28709b3-7641-45cc-9e79-9be140d2bcae-kube-api-access-mwpnr\") pod \"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.221695 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b28709b3-7641-45cc-9e79-9be140d2bcae-apiservice-cert\") pod \"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.221712 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b28709b3-7641-45cc-9e79-9be140d2bcae-webhook-cert\") pod \"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.221739 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8smx9\" (UniqueName: \"kubernetes.io/projected/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-kube-api-access-8smx9\") pod \"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: \"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.221769 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-webhook-cert\") pod \"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: 
\"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.224955 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.225681 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd"] Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.227574 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-webhook-cert\") pod \"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: \"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.230966 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-apiservice-cert\") pod \"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: \"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.286129 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8smx9\" (UniqueName: \"kubernetes.io/projected/d823fa6b-b1c9-4c8e-9da9-49e457c2fae6-kube-api-access-8smx9\") pod \"metallb-operator-controller-manager-76ff64997f-7v6kl\" (UID: \"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6\") " pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.297868 4696 generic.go:334] "Generic (PLEG): container finished" podID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" 
containerID="c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24" exitCode=0 Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.297927 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv6ct" event={"ID":"bf44929e-cf9c-4fac-95cf-aca629a8ba7a","Type":"ContainerDied","Data":"c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24"} Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.297952 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv6ct" event={"ID":"bf44929e-cf9c-4fac-95cf-aca629a8ba7a","Type":"ContainerStarted","Data":"5b49c545df1c0568656d84694cce7545ea4edabae34c2798e1f372693d1c10b0"} Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.300106 4696 generic.go:334] "Generic (PLEG): container finished" podID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerID="6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7" exitCode=0 Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.300161 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcfdt" event={"ID":"36c83acc-0ea1-4904-aeab-35b13fe3a92d","Type":"ContainerDied","Data":"6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7"} Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.324644 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b28709b3-7641-45cc-9e79-9be140d2bcae-apiservice-cert\") pod \"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.325013 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b28709b3-7641-45cc-9e79-9be140d2bcae-webhook-cert\") pod 
\"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.325137 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwpnr\" (UniqueName: \"kubernetes.io/projected/b28709b3-7641-45cc-9e79-9be140d2bcae-kube-api-access-mwpnr\") pod \"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.329327 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b28709b3-7641-45cc-9e79-9be140d2bcae-apiservice-cert\") pod \"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.340280 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b28709b3-7641-45cc-9e79-9be140d2bcae-webhook-cert\") pod \"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.346399 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwpnr\" (UniqueName: \"kubernetes.io/projected/b28709b3-7641-45cc-9e79-9be140d2bcae-kube-api-access-mwpnr\") pod \"metallb-operator-webhook-server-c9479f99b-72fxd\" (UID: \"b28709b3-7641-45cc-9e79-9be140d2bcae\") " pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.522095 4696 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.583791 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:51:55 crc kubenswrapper[4696]: I0318 15:51:55.859060 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd"] Mar 18 15:51:56 crc kubenswrapper[4696]: I0318 15:51:56.037718 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl"] Mar 18 15:51:56 crc kubenswrapper[4696]: I0318 15:51:56.313281 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv6ct" event={"ID":"bf44929e-cf9c-4fac-95cf-aca629a8ba7a","Type":"ContainerStarted","Data":"b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e"} Mar 18 15:51:56 crc kubenswrapper[4696]: I0318 15:51:56.316452 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcfdt" event={"ID":"36c83acc-0ea1-4904-aeab-35b13fe3a92d","Type":"ContainerStarted","Data":"d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21"} Mar 18 15:51:56 crc kubenswrapper[4696]: I0318 15:51:56.318346 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" event={"ID":"b28709b3-7641-45cc-9e79-9be140d2bcae","Type":"ContainerStarted","Data":"cb98207fdca8abf8abf3698c05e114d0d3592f5ead27fcade64432d419098bed"} Mar 18 15:51:56 crc kubenswrapper[4696]: I0318 15:51:56.319340 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" 
event={"ID":"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6","Type":"ContainerStarted","Data":"bb2d71d9656edcac84868fb4156499fadd2ff7d699a7da363621bf29f7c28775"} Mar 18 15:51:56 crc kubenswrapper[4696]: I0318 15:51:56.369427 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lcfdt" podStartSLOduration=2.845248387 podStartE2EDuration="5.369409606s" podCreationTimestamp="2026-03-18 15:51:51 +0000 UTC" firstStartedPulling="2026-03-18 15:51:53.282068983 +0000 UTC m=+956.288243189" lastFinishedPulling="2026-03-18 15:51:55.806230202 +0000 UTC m=+958.812404408" observedRunningTime="2026-03-18 15:51:56.36521789 +0000 UTC m=+959.371392106" watchObservedRunningTime="2026-03-18 15:51:56.369409606 +0000 UTC m=+959.375583822" Mar 18 15:51:57 crc kubenswrapper[4696]: I0318 15:51:57.327759 4696 generic.go:334] "Generic (PLEG): container finished" podID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerID="b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e" exitCode=0 Mar 18 15:51:57 crc kubenswrapper[4696]: I0318 15:51:57.328387 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv6ct" event={"ID":"bf44929e-cf9c-4fac-95cf-aca629a8ba7a","Type":"ContainerDied","Data":"b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e"} Mar 18 15:51:58 crc kubenswrapper[4696]: I0318 15:51:58.339377 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv6ct" event={"ID":"bf44929e-cf9c-4fac-95cf-aca629a8ba7a","Type":"ContainerStarted","Data":"42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd"} Mar 18 15:51:58 crc kubenswrapper[4696]: I0318 15:51:58.367879 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bv6ct" podStartSLOduration=2.90286269 podStartE2EDuration="5.367860325s" podCreationTimestamp="2026-03-18 15:51:53 +0000 UTC" 
firstStartedPulling="2026-03-18 15:51:55.300639952 +0000 UTC m=+958.306814158" lastFinishedPulling="2026-03-18 15:51:57.765637587 +0000 UTC m=+960.771811793" observedRunningTime="2026-03-18 15:51:58.366783728 +0000 UTC m=+961.372957934" watchObservedRunningTime="2026-03-18 15:51:58.367860325 +0000 UTC m=+961.374034531" Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.136452 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564152-55vm6"] Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.137536 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-55vm6" Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.141334 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.142056 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.143593 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.154231 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-55vm6"] Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.216722 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4dz9\" (UniqueName: \"kubernetes.io/projected/929544d7-d084-4e0c-bfa6-442fbd5a3ab4-kube-api-access-s4dz9\") pod \"auto-csr-approver-29564152-55vm6\" (UID: \"929544d7-d084-4e0c-bfa6-442fbd5a3ab4\") " pod="openshift-infra/auto-csr-approver-29564152-55vm6" Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.318291 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4dz9\" 
(UniqueName: \"kubernetes.io/projected/929544d7-d084-4e0c-bfa6-442fbd5a3ab4-kube-api-access-s4dz9\") pod \"auto-csr-approver-29564152-55vm6\" (UID: \"929544d7-d084-4e0c-bfa6-442fbd5a3ab4\") " pod="openshift-infra/auto-csr-approver-29564152-55vm6" Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.345349 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4dz9\" (UniqueName: \"kubernetes.io/projected/929544d7-d084-4e0c-bfa6-442fbd5a3ab4-kube-api-access-s4dz9\") pod \"auto-csr-approver-29564152-55vm6\" (UID: \"929544d7-d084-4e0c-bfa6-442fbd5a3ab4\") " pod="openshift-infra/auto-csr-approver-29564152-55vm6" Mar 18 15:52:00 crc kubenswrapper[4696]: I0318 15:52:00.455167 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-55vm6" Mar 18 15:52:01 crc kubenswrapper[4696]: I0318 15:52:01.917618 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:52:01 crc kubenswrapper[4696]: I0318 15:52:01.918069 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:52:02 crc kubenswrapper[4696]: I0318 15:52:02.020902 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:52:02 crc kubenswrapper[4696]: I0318 15:52:02.432210 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:52:02 crc kubenswrapper[4696]: I0318 15:52:02.851052 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-55vm6"] Mar 18 15:52:03 crc kubenswrapper[4696]: I0318 15:52:03.405975 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" 
event={"ID":"d823fa6b-b1c9-4c8e-9da9-49e457c2fae6","Type":"ContainerStarted","Data":"8d7326704d854ad62c3cfb020135b89499e4ccf84d2a767094f4b8aadc0821d8"} Mar 18 15:52:03 crc kubenswrapper[4696]: I0318 15:52:03.406294 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:52:03 crc kubenswrapper[4696]: I0318 15:52:03.410267 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-55vm6" event={"ID":"929544d7-d084-4e0c-bfa6-442fbd5a3ab4","Type":"ContainerStarted","Data":"9e0098883a613521195cd30df792f63dec142394729b0ad904f061a769392302"} Mar 18 15:52:03 crc kubenswrapper[4696]: I0318 15:52:03.413574 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" event={"ID":"b28709b3-7641-45cc-9e79-9be140d2bcae","Type":"ContainerStarted","Data":"f8d33abff1b07e381fa978fd7a300d0e45c81dbc84e14ef7aee35e00af822388"} Mar 18 15:52:03 crc kubenswrapper[4696]: I0318 15:52:03.413683 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:52:03 crc kubenswrapper[4696]: I0318 15:52:03.449876 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" podStartSLOduration=2.9283437919999997 podStartE2EDuration="9.449861821s" podCreationTimestamp="2026-03-18 15:51:54 +0000 UTC" firstStartedPulling="2026-03-18 15:51:56.06149114 +0000 UTC m=+959.067665346" lastFinishedPulling="2026-03-18 15:52:02.583009169 +0000 UTC m=+965.589183375" observedRunningTime="2026-03-18 15:52:03.448461175 +0000 UTC m=+966.454635401" watchObservedRunningTime="2026-03-18 15:52:03.449861821 +0000 UTC m=+966.456036027" Mar 18 15:52:03 crc kubenswrapper[4696]: I0318 15:52:03.493976 4696 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" podStartSLOduration=1.830687489 podStartE2EDuration="8.493959613s" podCreationTimestamp="2026-03-18 15:51:55 +0000 UTC" firstStartedPulling="2026-03-18 15:51:55.938302323 +0000 UTC m=+958.944476539" lastFinishedPulling="2026-03-18 15:52:02.601574457 +0000 UTC m=+965.607748663" observedRunningTime="2026-03-18 15:52:03.492916567 +0000 UTC m=+966.499090773" watchObservedRunningTime="2026-03-18 15:52:03.493959613 +0000 UTC m=+966.500133819" Mar 18 15:52:03 crc kubenswrapper[4696]: I0318 15:52:03.987589 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lcfdt"] Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.019320 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.020220 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.061558 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.419064 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lcfdt" podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerName="registry-server" containerID="cri-o://d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21" gracePeriod=2 Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.463931 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.798845 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.992626 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbrqt\" (UniqueName: \"kubernetes.io/projected/36c83acc-0ea1-4904-aeab-35b13fe3a92d-kube-api-access-hbrqt\") pod \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.992687 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-utilities\") pod \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.992750 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-catalog-content\") pod \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\" (UID: \"36c83acc-0ea1-4904-aeab-35b13fe3a92d\") " Mar 18 15:52:04 crc kubenswrapper[4696]: I0318 15:52:04.993496 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-utilities" (OuterVolumeSpecName: "utilities") pod "36c83acc-0ea1-4904-aeab-35b13fe3a92d" (UID: "36c83acc-0ea1-4904-aeab-35b13fe3a92d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.006669 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c83acc-0ea1-4904-aeab-35b13fe3a92d-kube-api-access-hbrqt" (OuterVolumeSpecName: "kube-api-access-hbrqt") pod "36c83acc-0ea1-4904-aeab-35b13fe3a92d" (UID: "36c83acc-0ea1-4904-aeab-35b13fe3a92d"). InnerVolumeSpecName "kube-api-access-hbrqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.049546 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36c83acc-0ea1-4904-aeab-35b13fe3a92d" (UID: "36c83acc-0ea1-4904-aeab-35b13fe3a92d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.095279 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbrqt\" (UniqueName: \"kubernetes.io/projected/36c83acc-0ea1-4904-aeab-35b13fe3a92d-kube-api-access-hbrqt\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.095325 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.095338 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c83acc-0ea1-4904-aeab-35b13fe3a92d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.426488 4696 generic.go:334] "Generic (PLEG): container finished" podID="929544d7-d084-4e0c-bfa6-442fbd5a3ab4" containerID="85d4d89a691ba2e1380ae2e2087cb08448358e08f1a924394cc63b30535917b9" exitCode=0 Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.426583 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-55vm6" event={"ID":"929544d7-d084-4e0c-bfa6-442fbd5a3ab4","Type":"ContainerDied","Data":"85d4d89a691ba2e1380ae2e2087cb08448358e08f1a924394cc63b30535917b9"} Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.429056 4696 generic.go:334] "Generic (PLEG): container 
finished" podID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerID="d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21" exitCode=0 Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.429107 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcfdt" event={"ID":"36c83acc-0ea1-4904-aeab-35b13fe3a92d","Type":"ContainerDied","Data":"d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21"} Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.429157 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcfdt" event={"ID":"36c83acc-0ea1-4904-aeab-35b13fe3a92d","Type":"ContainerDied","Data":"4b609eacd678af731e3ecbc3549ab644398f66302878a181664f93122c722335"} Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.429166 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcfdt" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.429179 4696 scope.go:117] "RemoveContainer" containerID="d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.452255 4696 scope.go:117] "RemoveContainer" containerID="6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.464924 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lcfdt"] Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.468700 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lcfdt"] Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.476027 4696 scope.go:117] "RemoveContainer" containerID="6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.501304 4696 scope.go:117] "RemoveContainer" 
containerID="d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21" Mar 18 15:52:05 crc kubenswrapper[4696]: E0318 15:52:05.501860 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21\": container with ID starting with d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21 not found: ID does not exist" containerID="d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.501937 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21"} err="failed to get container status \"d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21\": rpc error: code = NotFound desc = could not find container \"d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21\": container with ID starting with d853eee5b3d9ed06e0fb7d3b77c23a8402fc657c84628d60ef4346211ffb0b21 not found: ID does not exist" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.502001 4696 scope.go:117] "RemoveContainer" containerID="6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7" Mar 18 15:52:05 crc kubenswrapper[4696]: E0318 15:52:05.502451 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7\": container with ID starting with 6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7 not found: ID does not exist" containerID="6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.502491 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7"} err="failed to get container status \"6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7\": rpc error: code = NotFound desc = could not find container \"6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7\": container with ID starting with 6b5baacfb7da43a53c46081eb56ba1ff0249a65608635695233063551511d8e7 not found: ID does not exist" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.502535 4696 scope.go:117] "RemoveContainer" containerID="6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97" Mar 18 15:52:05 crc kubenswrapper[4696]: E0318 15:52:05.502901 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97\": container with ID starting with 6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97 not found: ID does not exist" containerID="6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.502938 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97"} err="failed to get container status \"6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97\": rpc error: code = NotFound desc = could not find container \"6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97\": container with ID starting with 6d2b7c012576c40046b08aab7b0cf16e60d013b601a76ab84eebe9a5b9781c97 not found: ID does not exist" Mar 18 15:52:05 crc kubenswrapper[4696]: I0318 15:52:05.604850 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" path="/var/lib/kubelet/pods/36c83acc-0ea1-4904-aeab-35b13fe3a92d/volumes" Mar 18 15:52:06 crc kubenswrapper[4696]: I0318 
15:52:06.386620 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bv6ct"] Mar 18 15:52:06 crc kubenswrapper[4696]: I0318 15:52:06.737953 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-55vm6" Mar 18 15:52:06 crc kubenswrapper[4696]: I0318 15:52:06.936704 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4dz9\" (UniqueName: \"kubernetes.io/projected/929544d7-d084-4e0c-bfa6-442fbd5a3ab4-kube-api-access-s4dz9\") pod \"929544d7-d084-4e0c-bfa6-442fbd5a3ab4\" (UID: \"929544d7-d084-4e0c-bfa6-442fbd5a3ab4\") " Mar 18 15:52:06 crc kubenswrapper[4696]: I0318 15:52:06.952718 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/929544d7-d084-4e0c-bfa6-442fbd5a3ab4-kube-api-access-s4dz9" (OuterVolumeSpecName: "kube-api-access-s4dz9") pod "929544d7-d084-4e0c-bfa6-442fbd5a3ab4" (UID: "929544d7-d084-4e0c-bfa6-442fbd5a3ab4"). InnerVolumeSpecName "kube-api-access-s4dz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.038444 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4dz9\" (UniqueName: \"kubernetes.io/projected/929544d7-d084-4e0c-bfa6-442fbd5a3ab4-kube-api-access-s4dz9\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.446452 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bv6ct" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerName="registry-server" containerID="cri-o://42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd" gracePeriod=2 Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.446714 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-55vm6" Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.446692 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-55vm6" event={"ID":"929544d7-d084-4e0c-bfa6-442fbd5a3ab4","Type":"ContainerDied","Data":"9e0098883a613521195cd30df792f63dec142394729b0ad904f061a769392302"} Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.446781 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e0098883a613521195cd30df792f63dec142394729b0ad904f061a769392302" Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.827618 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-6nclf"] Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.831713 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-6nclf"] Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.844493 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.950618 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-catalog-content\") pod \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.950772 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d68pr\" (UniqueName: \"kubernetes.io/projected/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-kube-api-access-d68pr\") pod \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.950819 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-utilities\") pod \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\" (UID: \"bf44929e-cf9c-4fac-95cf-aca629a8ba7a\") " Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.951985 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-utilities" (OuterVolumeSpecName: "utilities") pod "bf44929e-cf9c-4fac-95cf-aca629a8ba7a" (UID: "bf44929e-cf9c-4fac-95cf-aca629a8ba7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:52:07 crc kubenswrapper[4696]: I0318 15:52:07.955548 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-kube-api-access-d68pr" (OuterVolumeSpecName: "kube-api-access-d68pr") pod "bf44929e-cf9c-4fac-95cf-aca629a8ba7a" (UID: "bf44929e-cf9c-4fac-95cf-aca629a8ba7a"). InnerVolumeSpecName "kube-api-access-d68pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.000141 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf44929e-cf9c-4fac-95cf-aca629a8ba7a" (UID: "bf44929e-cf9c-4fac-95cf-aca629a8ba7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.052216 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.052251 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d68pr\" (UniqueName: \"kubernetes.io/projected/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-kube-api-access-d68pr\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.052268 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf44929e-cf9c-4fac-95cf-aca629a8ba7a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.453852 4696 generic.go:334] "Generic (PLEG): container finished" podID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerID="42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd" exitCode=0 Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.453893 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv6ct" event={"ID":"bf44929e-cf9c-4fac-95cf-aca629a8ba7a","Type":"ContainerDied","Data":"42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd"} Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.453900 4696 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-bv6ct" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.453960 4696 scope.go:117] "RemoveContainer" containerID="42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.453947 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv6ct" event={"ID":"bf44929e-cf9c-4fac-95cf-aca629a8ba7a","Type":"ContainerDied","Data":"5b49c545df1c0568656d84694cce7545ea4edabae34c2798e1f372693d1c10b0"} Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.478503 4696 scope.go:117] "RemoveContainer" containerID="b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.486488 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bv6ct"] Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.490742 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bv6ct"] Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.511548 4696 scope.go:117] "RemoveContainer" containerID="c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.524910 4696 scope.go:117] "RemoveContainer" containerID="42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd" Mar 18 15:52:08 crc kubenswrapper[4696]: E0318 15:52:08.525386 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd\": container with ID starting with 42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd not found: ID does not exist" containerID="42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.525422 
4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd"} err="failed to get container status \"42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd\": rpc error: code = NotFound desc = could not find container \"42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd\": container with ID starting with 42301adf7c2ef118db8e71b89e9141e5afc47ea832f8379065eb23f827d884cd not found: ID does not exist" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.525448 4696 scope.go:117] "RemoveContainer" containerID="b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e" Mar 18 15:52:08 crc kubenswrapper[4696]: E0318 15:52:08.525818 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e\": container with ID starting with b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e not found: ID does not exist" containerID="b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.525847 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e"} err="failed to get container status \"b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e\": rpc error: code = NotFound desc = could not find container \"b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e\": container with ID starting with b111f36bac02c41962b006dfe2542be9135cada12931aa46f702e8e9351f924e not found: ID does not exist" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.525865 4696 scope.go:117] "RemoveContainer" containerID="c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24" Mar 18 15:52:08 crc kubenswrapper[4696]: E0318 
15:52:08.526328 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24\": container with ID starting with c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24 not found: ID does not exist" containerID="c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24" Mar 18 15:52:08 crc kubenswrapper[4696]: I0318 15:52:08.526352 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24"} err="failed to get container status \"c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24\": rpc error: code = NotFound desc = could not find container \"c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24\": container with ID starting with c20cf9cefd2d8c8a7ea3e33a239ac91a5f9da2e24f49cb4febdc82ad64691c24 not found: ID does not exist" Mar 18 15:52:09 crc kubenswrapper[4696]: I0318 15:52:09.605028 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106545a2-6035-42b7-ac34-45bcbb0451ed" path="/var/lib/kubelet/pods/106545a2-6035-42b7-ac34-45bcbb0451ed/volumes" Mar 18 15:52:09 crc kubenswrapper[4696]: I0318 15:52:09.606809 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" path="/var/lib/kubelet/pods/bf44929e-cf9c-4fac-95cf-aca629a8ba7a/volumes" Mar 18 15:52:12 crc kubenswrapper[4696]: I0318 15:52:12.184445 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:52:12 crc kubenswrapper[4696]: I0318 15:52:12.184541 4696 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:52:15 crc kubenswrapper[4696]: I0318 15:52:15.527020 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c9479f99b-72fxd" Mar 18 15:52:35 crc kubenswrapper[4696]: I0318 15:52:35.587013 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-76ff64997f-7v6kl" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.199628 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt"] Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.199975 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerName="extract-content" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.199993 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerName="extract-content" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.200006 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="929544d7-d084-4e0c-bfa6-442fbd5a3ab4" containerName="oc" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200016 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="929544d7-d084-4e0c-bfa6-442fbd5a3ab4" containerName="oc" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.200039 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerName="registry-server" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200047 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" 
containerName="registry-server" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.200061 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerName="registry-server" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200068 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerName="registry-server" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.200079 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerName="extract-utilities" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200087 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerName="extract-utilities" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.200097 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerName="extract-content" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200104 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerName="extract-content" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.200115 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerName="extract-utilities" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200121 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerName="extract-utilities" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200229 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf44929e-cf9c-4fac-95cf-aca629a8ba7a" containerName="registry-server" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200241 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="36c83acc-0ea1-4904-aeab-35b13fe3a92d" containerName="registry-server" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200252 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="929544d7-d084-4e0c-bfa6-442fbd5a3ab4" containerName="oc" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.200821 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.202955 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.202971 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-svpsc" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.204774 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-r98m9"] Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.210188 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.212426 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.212434 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt"] Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.216894 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272740 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nfqc\" (UniqueName: \"kubernetes.io/projected/ccff3265-0675-4907-bcae-b20d0ebddd56-kube-api-access-4nfqc\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272808 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-conf\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272834 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-reloader\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272860 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics-certs\") pod 
\"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272887 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a00a45-b349-471d-816b-a05268da02e4-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-whtkt\" (UID: \"58a00a45-b349-471d-816b-a05268da02e4\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272912 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngtf\" (UniqueName: \"kubernetes.io/projected/58a00a45-b349-471d-816b-a05268da02e4-kube-api-access-cngtf\") pod \"frr-k8s-webhook-server-bcc4b6f68-whtkt\" (UID: \"58a00a45-b349-471d-816b-a05268da02e4\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272931 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-startup\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272946 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-sockets\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.272969 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics\") pod \"frr-k8s-r98m9\" 
(UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.302772 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pr9g2"] Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.303964 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.306132 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-544wq" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.306795 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.306849 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.306953 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.318991 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-fnvx2"] Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.320002 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.321840 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.337872 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-fnvx2"] Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373712 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-reloader\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373777 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics-certs\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373808 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a00a45-b349-471d-816b-a05268da02e4-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-whtkt\" (UID: \"58a00a45-b349-471d-816b-a05268da02e4\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373832 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cngtf\" (UniqueName: \"kubernetes.io/projected/58a00a45-b349-471d-816b-a05268da02e4-kube-api-access-cngtf\") pod \"frr-k8s-webhook-server-bcc4b6f68-whtkt\" (UID: \"58a00a45-b349-471d-816b-a05268da02e4\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 
15:52:36.373854 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-startup\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373871 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-sockets\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373893 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-metrics-certs\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373912 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373933 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-cert\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.373951 4696 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" 
not found Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.373970 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rthr\" (UniqueName: \"kubernetes.io/projected/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-kube-api-access-7rthr\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.374033 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58a00a45-b349-471d-816b-a05268da02e4-cert podName:58a00a45-b349-471d-816b-a05268da02e4 nodeName:}" failed. No retries permitted until 2026-03-18 15:52:36.874012096 +0000 UTC m=+999.880186302 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58a00a45-b349-471d-816b-a05268da02e4-cert") pod "frr-k8s-webhook-server-bcc4b6f68-whtkt" (UID: "58a00a45-b349-471d-816b-a05268da02e4") : secret "frr-k8s-webhook-server-cert" not found Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.374040 4696 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.374125 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics-certs podName:ccff3265-0675-4907-bcae-b20d0ebddd56 nodeName:}" failed. No retries permitted until 2026-03-18 15:52:36.874100759 +0000 UTC m=+999.880274965 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics-certs") pod "frr-k8s-r98m9" (UID: "ccff3265-0675-4907-bcae-b20d0ebddd56") : secret "frr-k8s-certs-secret" not found Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374230 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nfqc\" (UniqueName: \"kubernetes.io/projected/ccff3265-0675-4907-bcae-b20d0ebddd56-kube-api-access-4nfqc\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374243 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-reloader\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374262 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374299 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-metrics-certs\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374295 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics\") pod \"frr-k8s-r98m9\" (UID: 
\"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374336 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5556d2f1-1113-45f4-89d4-deea421bb0aa-metallb-excludel2\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374376 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz7gm\" (UniqueName: \"kubernetes.io/projected/5556d2f1-1113-45f4-89d4-deea421bb0aa-kube-api-access-qz7gm\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374430 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-conf\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374507 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-sockets\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374680 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-conf\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.374911 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ccff3265-0675-4907-bcae-b20d0ebddd56-frr-startup\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.395411 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nfqc\" (UniqueName: \"kubernetes.io/projected/ccff3265-0675-4907-bcae-b20d0ebddd56-kube-api-access-4nfqc\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.396035 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngtf\" (UniqueName: \"kubernetes.io/projected/58a00a45-b349-471d-816b-a05268da02e4-kube-api-access-cngtf\") pod \"frr-k8s-webhook-server-bcc4b6f68-whtkt\" (UID: \"58a00a45-b349-471d-816b-a05268da02e4\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.475248 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-metrics-certs\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.475306 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-cert\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.475347 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rthr\" 
(UniqueName: \"kubernetes.io/projected/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-kube-api-access-7rthr\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.475381 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.475400 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-metrics-certs\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.475434 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz7gm\" (UniqueName: \"kubernetes.io/projected/5556d2f1-1113-45f4-89d4-deea421bb0aa-kube-api-access-qz7gm\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.475455 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5556d2f1-1113-45f4-89d4-deea421bb0aa-metallb-excludel2\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.475917 4696 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.476086 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist podName:5556d2f1-1113-45f4-89d4-deea421bb0aa nodeName:}" failed. No retries permitted until 2026-03-18 15:52:36.97606737 +0000 UTC m=+999.982241566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist") pod "speaker-pr9g2" (UID: "5556d2f1-1113-45f4-89d4-deea421bb0aa") : secret "metallb-memberlist" not found Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.476017 4696 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.476256 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-metrics-certs podName:5556d2f1-1113-45f4-89d4-deea421bb0aa nodeName:}" failed. No retries permitted until 2026-03-18 15:52:36.976243345 +0000 UTC m=+999.982417551 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-metrics-certs") pod "speaker-pr9g2" (UID: "5556d2f1-1113-45f4-89d4-deea421bb0aa") : secret "speaker-certs-secret" not found Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.476202 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5556d2f1-1113-45f4-89d4-deea421bb0aa-metallb-excludel2\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.478331 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.479641 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-metrics-certs\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.490043 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-cert\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.493088 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rthr\" (UniqueName: \"kubernetes.io/projected/e1b08d64-c01c-4cb5-b1ee-8cfc03868c70-kube-api-access-7rthr\") pod \"controller-7bb4cc7c98-fnvx2\" (UID: \"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70\") " pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.494635 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz7gm\" (UniqueName: \"kubernetes.io/projected/5556d2f1-1113-45f4-89d4-deea421bb0aa-kube-api-access-qz7gm\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.634892 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.847650 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-fnvx2"] Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.881700 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics-certs\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.881767 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a00a45-b349-471d-816b-a05268da02e4-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-whtkt\" (UID: \"58a00a45-b349-471d-816b-a05268da02e4\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.890205 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a00a45-b349-471d-816b-a05268da02e4-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-whtkt\" (UID: \"58a00a45-b349-471d-816b-a05268da02e4\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.890215 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ccff3265-0675-4907-bcae-b20d0ebddd56-metrics-certs\") pod \"frr-k8s-r98m9\" (UID: \"ccff3265-0675-4907-bcae-b20d0ebddd56\") " pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.983081 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.983127 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-metrics-certs\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.983396 4696 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 15:52:36 crc kubenswrapper[4696]: E0318 15:52:36.983498 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist podName:5556d2f1-1113-45f4-89d4-deea421bb0aa nodeName:}" failed. No retries permitted until 2026-03-18 15:52:37.983476187 +0000 UTC m=+1000.989650393 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist") pod "speaker-pr9g2" (UID: "5556d2f1-1113-45f4-89d4-deea421bb0aa") : secret "metallb-memberlist" not found Mar 18 15:52:36 crc kubenswrapper[4696]: I0318 15:52:36.988067 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-metrics-certs\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.123874 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.132645 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.370412 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt"] Mar 18 15:52:37 crc kubenswrapper[4696]: W0318 15:52:37.377103 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58a00a45_b349_471d_816b_a05268da02e4.slice/crio-18513a02e70e53abdfa0ffdc4fb7fab8ffbfb9b76e5e501344e49828463a510d WatchSource:0}: Error finding container 18513a02e70e53abdfa0ffdc4fb7fab8ffbfb9b76e5e501344e49828463a510d: Status 404 returned error can't find the container with id 18513a02e70e53abdfa0ffdc4fb7fab8ffbfb9b76e5e501344e49828463a510d Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.621795 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" 
event={"ID":"58a00a45-b349-471d-816b-a05268da02e4","Type":"ContainerStarted","Data":"18513a02e70e53abdfa0ffdc4fb7fab8ffbfb9b76e5e501344e49828463a510d"} Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.622732 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerStarted","Data":"a0149b4268a3dc545fdf4e7ca36b9e8f220eecb99cf1d7decbbf940cd1b3b8f2"} Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.624019 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-fnvx2" event={"ID":"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70","Type":"ContainerStarted","Data":"236e870fd7f0a4877dbb733f4a27b2020c23ea25a9e7b4095467c47a65259bb7"} Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.624046 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-fnvx2" event={"ID":"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70","Type":"ContainerStarted","Data":"9b1ba388744afaa6896db15b72862fc3bd449674b9236aa8ab51b6225e12290f"} Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.624067 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-fnvx2" event={"ID":"e1b08d64-c01c-4cb5-b1ee-8cfc03868c70","Type":"ContainerStarted","Data":"0d8ced84295a7046c837460b6d6762a1bc6171c65548f59c7acda143b1033575"} Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.624915 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.653991 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-fnvx2" podStartSLOduration=1.653963526 podStartE2EDuration="1.653963526s" podCreationTimestamp="2026-03-18 15:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-18 15:52:37.652694004 +0000 UTC m=+1000.658868210" watchObservedRunningTime="2026-03-18 15:52:37.653963526 +0000 UTC m=+1000.660137732" Mar 18 15:52:37 crc kubenswrapper[4696]: I0318 15:52:37.998856 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:38 crc kubenswrapper[4696]: I0318 15:52:38.008073 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5556d2f1-1113-45f4-89d4-deea421bb0aa-memberlist\") pod \"speaker-pr9g2\" (UID: \"5556d2f1-1113-45f4-89d4-deea421bb0aa\") " pod="metallb-system/speaker-pr9g2" Mar 18 15:52:38 crc kubenswrapper[4696]: I0318 15:52:38.118343 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pr9g2" Mar 18 15:52:38 crc kubenswrapper[4696]: W0318 15:52:38.140115 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5556d2f1_1113_45f4_89d4_deea421bb0aa.slice/crio-5f20897a15da2a66d1b43852d07c20bf7230d2a7e18c0ae8efb0b23ed6b5ee9b WatchSource:0}: Error finding container 5f20897a15da2a66d1b43852d07c20bf7230d2a7e18c0ae8efb0b23ed6b5ee9b: Status 404 returned error can't find the container with id 5f20897a15da2a66d1b43852d07c20bf7230d2a7e18c0ae8efb0b23ed6b5ee9b Mar 18 15:52:38 crc kubenswrapper[4696]: I0318 15:52:38.639222 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pr9g2" event={"ID":"5556d2f1-1113-45f4-89d4-deea421bb0aa","Type":"ContainerStarted","Data":"e63c9f24b5de08812dee3191474b644246156e8f12e685b5701daed67b29b847"} Mar 18 15:52:38 crc kubenswrapper[4696]: I0318 15:52:38.639288 4696 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/speaker-pr9g2" event={"ID":"5556d2f1-1113-45f4-89d4-deea421bb0aa","Type":"ContainerStarted","Data":"1070df89dbc95d1e0da27b2651bc0068aa049f7fc9b299151e804c3cba23cf6a"} Mar 18 15:52:38 crc kubenswrapper[4696]: I0318 15:52:38.639302 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pr9g2" event={"ID":"5556d2f1-1113-45f4-89d4-deea421bb0aa","Type":"ContainerStarted","Data":"5f20897a15da2a66d1b43852d07c20bf7230d2a7e18c0ae8efb0b23ed6b5ee9b"} Mar 18 15:52:38 crc kubenswrapper[4696]: I0318 15:52:38.639574 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-pr9g2" Mar 18 15:52:42 crc kubenswrapper[4696]: I0318 15:52:42.184201 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:52:42 crc kubenswrapper[4696]: I0318 15:52:42.184668 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:52:42 crc kubenswrapper[4696]: I0318 15:52:42.184721 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:52:42 crc kubenswrapper[4696]: I0318 15:52:42.185294 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8dbceebb5bec41c37e5bdb742818ff7d79c094f0f97795f1ea326504fa9fafa5"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:52:42 crc kubenswrapper[4696]: I0318 15:52:42.185345 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://8dbceebb5bec41c37e5bdb742818ff7d79c094f0f97795f1ea326504fa9fafa5" gracePeriod=600 Mar 18 15:52:42 crc kubenswrapper[4696]: I0318 15:52:42.686107 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="8dbceebb5bec41c37e5bdb742818ff7d79c094f0f97795f1ea326504fa9fafa5" exitCode=0 Mar 18 15:52:42 crc kubenswrapper[4696]: I0318 15:52:42.686169 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"8dbceebb5bec41c37e5bdb742818ff7d79c094f0f97795f1ea326504fa9fafa5"} Mar 18 15:52:42 crc kubenswrapper[4696]: I0318 15:52:42.686206 4696 scope.go:117] "RemoveContainer" containerID="1b4bf50aa0e21fc64691951c3d86757dea376b50f966710c0ef2ad26f0257226" Mar 18 15:52:45 crc kubenswrapper[4696]: I0318 15:52:45.706649 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"859c999f0d34d60bd36ffb5138cc29b3e68983a12c55849818cd9411add0b7fd"} Mar 18 15:52:45 crc kubenswrapper[4696]: I0318 15:52:45.708755 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" event={"ID":"58a00a45-b349-471d-816b-a05268da02e4","Type":"ContainerStarted","Data":"59d2b233867b091bbbba841886496b8c414060e64f87af5628cb09c4807c9abb"} Mar 18 15:52:45 crc kubenswrapper[4696]: I0318 15:52:45.709113 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:45 crc kubenswrapper[4696]: I0318 15:52:45.711113 4696 generic.go:334] "Generic (PLEG): container finished" podID="ccff3265-0675-4907-bcae-b20d0ebddd56" containerID="d082d0fd8512a3227b5bdba78146c1e2ef59c66dd486ad4ef64975a47b176dd1" exitCode=0 Mar 18 15:52:45 crc kubenswrapper[4696]: I0318 15:52:45.711144 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerDied","Data":"d082d0fd8512a3227b5bdba78146c1e2ef59c66dd486ad4ef64975a47b176dd1"} Mar 18 15:52:45 crc kubenswrapper[4696]: I0318 15:52:45.733742 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-pr9g2" podStartSLOduration=9.733724403 podStartE2EDuration="9.733724403s" podCreationTimestamp="2026-03-18 15:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:52:38.661068365 +0000 UTC m=+1001.667242591" watchObservedRunningTime="2026-03-18 15:52:45.733724403 +0000 UTC m=+1008.739898609" Mar 18 15:52:45 crc kubenswrapper[4696]: I0318 15:52:45.787783 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" podStartSLOduration=2.300105442 podStartE2EDuration="9.787760496s" podCreationTimestamp="2026-03-18 15:52:36 +0000 UTC" firstStartedPulling="2026-03-18 15:52:37.379938496 +0000 UTC m=+1000.386112702" lastFinishedPulling="2026-03-18 15:52:44.86759355 +0000 UTC m=+1007.873767756" observedRunningTime="2026-03-18 15:52:45.755676857 +0000 UTC m=+1008.761851063" watchObservedRunningTime="2026-03-18 15:52:45.787760496 +0000 UTC m=+1008.793934702" Mar 18 15:52:46 crc kubenswrapper[4696]: I0318 15:52:46.718712 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="ccff3265-0675-4907-bcae-b20d0ebddd56" containerID="fcebd68534be30b864774150359a2ada61cc8db41eccd83e85d0005caa7dd22e" exitCode=0 Mar 18 15:52:46 crc kubenswrapper[4696]: I0318 15:52:46.718776 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerDied","Data":"fcebd68534be30b864774150359a2ada61cc8db41eccd83e85d0005caa7dd22e"} Mar 18 15:52:47 crc kubenswrapper[4696]: I0318 15:52:47.728213 4696 generic.go:334] "Generic (PLEG): container finished" podID="ccff3265-0675-4907-bcae-b20d0ebddd56" containerID="c97e4a6b1a344fdc882a0249ec9533efad4e666ebcb352222e0f413a7670c2e0" exitCode=0 Mar 18 15:52:47 crc kubenswrapper[4696]: I0318 15:52:47.728278 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerDied","Data":"c97e4a6b1a344fdc882a0249ec9533efad4e666ebcb352222e0f413a7670c2e0"} Mar 18 15:52:48 crc kubenswrapper[4696]: I0318 15:52:48.126578 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pr9g2" Mar 18 15:52:48 crc kubenswrapper[4696]: I0318 15:52:48.739869 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerStarted","Data":"1c4beb9d402e9ad5a21668403ffaf765ce1e480d37f293389e3d6d17eb2f28a5"} Mar 18 15:52:48 crc kubenswrapper[4696]: I0318 15:52:48.740324 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerStarted","Data":"a9c61d11c840254a0a364ec4adbc03a6892ab6346508f2cdf8d3cf6da7229384"} Mar 18 15:52:48 crc kubenswrapper[4696]: I0318 15:52:48.740338 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" 
event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerStarted","Data":"3fea5eddcd763c51730409f2bc41db5445e6782b4ca8c8ca15a84cd3c0bfcdee"} Mar 18 15:52:48 crc kubenswrapper[4696]: I0318 15:52:48.740348 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerStarted","Data":"1f4bdb751a45ae8285833a0db3330f6795cc2ebef6fb3ccf3e41d08d28c67e43"} Mar 18 15:52:48 crc kubenswrapper[4696]: I0318 15:52:48.740359 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerStarted","Data":"bbbbf5848a5f3446808b219ccffbefb1b8dbf0decf3ce7708047aafabd66dc97"} Mar 18 15:52:49 crc kubenswrapper[4696]: I0318 15:52:49.750135 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r98m9" event={"ID":"ccff3265-0675-4907-bcae-b20d0ebddd56","Type":"ContainerStarted","Data":"609304bcd804ba1e90603b7cbc57fa319ec358ac0188fbe12a47dceee9113035"} Mar 18 15:52:49 crc kubenswrapper[4696]: I0318 15:52:49.750542 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:49 crc kubenswrapper[4696]: I0318 15:52:49.779806 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-r98m9" podStartSLOduration=6.190613738 podStartE2EDuration="13.779780102s" podCreationTimestamp="2026-03-18 15:52:36 +0000 UTC" firstStartedPulling="2026-03-18 15:52:37.272859625 +0000 UTC m=+1000.279033831" lastFinishedPulling="2026-03-18 15:52:44.862025989 +0000 UTC m=+1007.868200195" observedRunningTime="2026-03-18 15:52:49.773771711 +0000 UTC m=+1012.779945917" watchObservedRunningTime="2026-03-18 15:52:49.779780102 +0000 UTC m=+1012.785954328" Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.571243 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-262qc"] Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.572132 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-262qc" Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.574666 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-skt56" Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.575070 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.582025 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.601988 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-262qc"] Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.684369 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzr2g\" (UniqueName: \"kubernetes.io/projected/c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70-kube-api-access-xzr2g\") pod \"openstack-operator-index-262qc\" (UID: \"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70\") " pod="openstack-operators/openstack-operator-index-262qc" Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.786154 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzr2g\" (UniqueName: \"kubernetes.io/projected/c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70-kube-api-access-xzr2g\") pod \"openstack-operator-index-262qc\" (UID: \"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70\") " pod="openstack-operators/openstack-operator-index-262qc" Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.804735 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzr2g\" 
(UniqueName: \"kubernetes.io/projected/c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70-kube-api-access-xzr2g\") pod \"openstack-operator-index-262qc\" (UID: \"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70\") " pod="openstack-operators/openstack-operator-index-262qc" Mar 18 15:52:50 crc kubenswrapper[4696]: I0318 15:52:50.899684 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-262qc" Mar 18 15:52:51 crc kubenswrapper[4696]: I0318 15:52:51.390164 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-262qc"] Mar 18 15:52:51 crc kubenswrapper[4696]: W0318 15:52:51.395173 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5077ae2_6db4_444f_a5fa_ddcfbb4b6b70.slice/crio-037d86d873e3e3f920a3d2892c4cbc17f86a778126db03fda375429090afebdf WatchSource:0}: Error finding container 037d86d873e3e3f920a3d2892c4cbc17f86a778126db03fda375429090afebdf: Status 404 returned error can't find the container with id 037d86d873e3e3f920a3d2892c4cbc17f86a778126db03fda375429090afebdf Mar 18 15:52:51 crc kubenswrapper[4696]: I0318 15:52:51.763329 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-262qc" event={"ID":"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70","Type":"ContainerStarted","Data":"037d86d873e3e3f920a3d2892c4cbc17f86a778126db03fda375429090afebdf"} Mar 18 15:52:52 crc kubenswrapper[4696]: I0318 15:52:52.133393 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:52 crc kubenswrapper[4696]: I0318 15:52:52.175244 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:53 crc kubenswrapper[4696]: I0318 15:52:53.757643 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-262qc"] Mar 18 15:52:54 crc kubenswrapper[4696]: I0318 15:52:54.360480 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-96vqk"] Mar 18 15:52:54 crc kubenswrapper[4696]: I0318 15:52:54.361479 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:52:54 crc kubenswrapper[4696]: I0318 15:52:54.369122 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-96vqk"] Mar 18 15:52:54 crc kubenswrapper[4696]: I0318 15:52:54.451037 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc4p9\" (UniqueName: \"kubernetes.io/projected/d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d-kube-api-access-pc4p9\") pod \"openstack-operator-index-96vqk\" (UID: \"d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d\") " pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:52:54 crc kubenswrapper[4696]: I0318 15:52:54.552876 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4p9\" (UniqueName: \"kubernetes.io/projected/d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d-kube-api-access-pc4p9\") pod \"openstack-operator-index-96vqk\" (UID: \"d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d\") " pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:52:54 crc kubenswrapper[4696]: I0318 15:52:54.583340 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc4p9\" (UniqueName: \"kubernetes.io/projected/d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d-kube-api-access-pc4p9\") pod \"openstack-operator-index-96vqk\" (UID: \"d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d\") " pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:52:54 crc kubenswrapper[4696]: I0318 15:52:54.686905 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:52:55 crc kubenswrapper[4696]: I0318 15:52:55.184978 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-96vqk"] Mar 18 15:52:55 crc kubenswrapper[4696]: I0318 15:52:55.789461 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-96vqk" event={"ID":"d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d","Type":"ContainerStarted","Data":"f38360d39d36209a6490d4ea4cdfe89b94af22d30f168740ebacae8fa908319f"} Mar 18 15:52:55 crc kubenswrapper[4696]: I0318 15:52:55.789843 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-96vqk" event={"ID":"d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d","Type":"ContainerStarted","Data":"8766ed3a0c9b77dee8256352a8a72d15d99b95fc460722912f05687f9bc44a30"} Mar 18 15:52:55 crc kubenswrapper[4696]: I0318 15:52:55.791735 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-262qc" event={"ID":"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70","Type":"ContainerStarted","Data":"3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246"} Mar 18 15:52:55 crc kubenswrapper[4696]: I0318 15:52:55.791958 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-262qc" podUID="c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70" containerName="registry-server" containerID="cri-o://3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246" gracePeriod=2 Mar 18 15:52:55 crc kubenswrapper[4696]: I0318 15:52:55.803822 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-96vqk" podStartSLOduration=1.733983273 podStartE2EDuration="1.803803323s" podCreationTimestamp="2026-03-18 15:52:54 +0000 UTC" firstStartedPulling="2026-03-18 15:52:55.213920557 +0000 UTC 
m=+1018.220094773" lastFinishedPulling="2026-03-18 15:52:55.283740617 +0000 UTC m=+1018.289914823" observedRunningTime="2026-03-18 15:52:55.802465459 +0000 UTC m=+1018.808639665" watchObservedRunningTime="2026-03-18 15:52:55.803803323 +0000 UTC m=+1018.809977539" Mar 18 15:52:55 crc kubenswrapper[4696]: I0318 15:52:55.830512 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-262qc" podStartSLOduration=2.21714007 podStartE2EDuration="5.830496736s" podCreationTimestamp="2026-03-18 15:52:50 +0000 UTC" firstStartedPulling="2026-03-18 15:52:51.397552731 +0000 UTC m=+1014.403726937" lastFinishedPulling="2026-03-18 15:52:55.010909397 +0000 UTC m=+1018.017083603" observedRunningTime="2026-03-18 15:52:55.827045209 +0000 UTC m=+1018.833219425" watchObservedRunningTime="2026-03-18 15:52:55.830496736 +0000 UTC m=+1018.836670942" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.185356 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-262qc" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.276339 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzr2g\" (UniqueName: \"kubernetes.io/projected/c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70-kube-api-access-xzr2g\") pod \"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70\" (UID: \"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70\") " Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.288812 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70-kube-api-access-xzr2g" (OuterVolumeSpecName: "kube-api-access-xzr2g") pod "c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70" (UID: "c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70"). InnerVolumeSpecName "kube-api-access-xzr2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.377946 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzr2g\" (UniqueName: \"kubernetes.io/projected/c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70-kube-api-access-xzr2g\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.641619 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-fnvx2" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.798200 4696 generic.go:334] "Generic (PLEG): container finished" podID="c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70" containerID="3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246" exitCode=0 Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.798264 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-262qc" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.798300 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-262qc" event={"ID":"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70","Type":"ContainerDied","Data":"3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246"} Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.798359 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-262qc" event={"ID":"c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70","Type":"ContainerDied","Data":"037d86d873e3e3f920a3d2892c4cbc17f86a778126db03fda375429090afebdf"} Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.798387 4696 scope.go:117] "RemoveContainer" containerID="3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.815893 4696 scope.go:117] "RemoveContainer" containerID="3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246" Mar 18 
15:52:56 crc kubenswrapper[4696]: E0318 15:52:56.816319 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246\": container with ID starting with 3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246 not found: ID does not exist" containerID="3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.816363 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246"} err="failed to get container status \"3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246\": rpc error: code = NotFound desc = could not find container \"3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246\": container with ID starting with 3c57c4a4982fdcd1ee51d9fce4f35e910923c2c56ea74cbe0190f81a8c8dd246 not found: ID does not exist" Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.836472 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-262qc"] Mar 18 15:52:56 crc kubenswrapper[4696]: I0318 15:52:56.848900 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-262qc"] Mar 18 15:52:57 crc kubenswrapper[4696]: I0318 15:52:57.130698 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-whtkt" Mar 18 15:52:57 crc kubenswrapper[4696]: I0318 15:52:57.135931 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-r98m9" Mar 18 15:52:57 crc kubenswrapper[4696]: I0318 15:52:57.606262 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70" 
path="/var/lib/kubelet/pods/c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70/volumes" Mar 18 15:53:04 crc kubenswrapper[4696]: I0318 15:53:04.687418 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:53:04 crc kubenswrapper[4696]: I0318 15:53:04.687770 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:53:04 crc kubenswrapper[4696]: I0318 15:53:04.717648 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:53:04 crc kubenswrapper[4696]: I0318 15:53:04.876358 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-96vqk" Mar 18 15:53:06 crc kubenswrapper[4696]: I0318 15:53:06.924850 4696 scope.go:117] "RemoveContainer" containerID="968e99afcac02b809d78b9e9dcd9b5d1b1f16f9506bf67c5e6c3014f292ea646" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.208149 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45"] Mar 18 15:53:07 crc kubenswrapper[4696]: E0318 15:53:07.209224 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70" containerName="registry-server" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.209256 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70" containerName="registry-server" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.209493 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5077ae2-6db4-444f-a5fa-ddcfbb4b6b70" containerName="registry-server" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.210830 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.214038 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wbkf8" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.224269 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45"] Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.242182 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-util\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.242248 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-bundle\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.242300 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4kv\" (UniqueName: \"kubernetes.io/projected/b2fcb79b-86ba-4c55-babe-94a5edc318fd-kube-api-access-wt4kv\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 
15:53:07.343277 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-util\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.343421 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-bundle\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.343481 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4kv\" (UniqueName: \"kubernetes.io/projected/b2fcb79b-86ba-4c55-babe-94a5edc318fd-kube-api-access-wt4kv\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.344447 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-bundle\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.344479 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-util\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.370177 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4kv\" (UniqueName: \"kubernetes.io/projected/b2fcb79b-86ba-4c55-babe-94a5edc318fd-kube-api-access-wt4kv\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:07 crc kubenswrapper[4696]: I0318 15:53:07.549057 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:08 crc kubenswrapper[4696]: I0318 15:53:08.037972 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45"] Mar 18 15:53:08 crc kubenswrapper[4696]: W0318 15:53:08.038221 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2fcb79b_86ba_4c55_babe_94a5edc318fd.slice/crio-c36e0272ec994fccaa08ca6c3045ebb5db4e8e11ff7931769a9016c89118906d WatchSource:0}: Error finding container c36e0272ec994fccaa08ca6c3045ebb5db4e8e11ff7931769a9016c89118906d: Status 404 returned error can't find the container with id c36e0272ec994fccaa08ca6c3045ebb5db4e8e11ff7931769a9016c89118906d Mar 18 15:53:08 crc kubenswrapper[4696]: I0318 15:53:08.883917 4696 generic.go:334] "Generic (PLEG): container finished" podID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerID="6b228cdbb8274d929935c48272111df6ca4412fd08dfe8e420cf1e00ef7d14f7" exitCode=0 Mar 18 
15:53:08 crc kubenswrapper[4696]: I0318 15:53:08.884027 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" event={"ID":"b2fcb79b-86ba-4c55-babe-94a5edc318fd","Type":"ContainerDied","Data":"6b228cdbb8274d929935c48272111df6ca4412fd08dfe8e420cf1e00ef7d14f7"} Mar 18 15:53:08 crc kubenswrapper[4696]: I0318 15:53:08.884073 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" event={"ID":"b2fcb79b-86ba-4c55-babe-94a5edc318fd","Type":"ContainerStarted","Data":"c36e0272ec994fccaa08ca6c3045ebb5db4e8e11ff7931769a9016c89118906d"} Mar 18 15:53:09 crc kubenswrapper[4696]: E0318 15:53:09.528790 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2fcb79b_86ba_4c55_babe_94a5edc318fd.slice/crio-8751cc43ea19fae56b8fe31a3d3060ce8247b426795a43189b8d301fab358650.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2fcb79b_86ba_4c55_babe_94a5edc318fd.slice/crio-conmon-8751cc43ea19fae56b8fe31a3d3060ce8247b426795a43189b8d301fab358650.scope\": RecentStats: unable to find data in memory cache]" Mar 18 15:53:09 crc kubenswrapper[4696]: I0318 15:53:09.895139 4696 generic.go:334] "Generic (PLEG): container finished" podID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerID="8751cc43ea19fae56b8fe31a3d3060ce8247b426795a43189b8d301fab358650" exitCode=0 Mar 18 15:53:09 crc kubenswrapper[4696]: I0318 15:53:09.895225 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" event={"ID":"b2fcb79b-86ba-4c55-babe-94a5edc318fd","Type":"ContainerDied","Data":"8751cc43ea19fae56b8fe31a3d3060ce8247b426795a43189b8d301fab358650"} Mar 18 15:53:10 
crc kubenswrapper[4696]: I0318 15:53:10.905218 4696 generic.go:334] "Generic (PLEG): container finished" podID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerID="48eb3a64916e6af03cb2b91fc9f21c20d2104457971b0828358f31e0bf2fa10c" exitCode=0 Mar 18 15:53:10 crc kubenswrapper[4696]: I0318 15:53:10.905289 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" event={"ID":"b2fcb79b-86ba-4c55-babe-94a5edc318fd","Type":"ContainerDied","Data":"48eb3a64916e6af03cb2b91fc9f21c20d2104457971b0828358f31e0bf2fa10c"} Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.253911 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.318111 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-bundle\") pod \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.318171 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt4kv\" (UniqueName: \"kubernetes.io/projected/b2fcb79b-86ba-4c55-babe-94a5edc318fd-kube-api-access-wt4kv\") pod \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.318221 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-util\") pod \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\" (UID: \"b2fcb79b-86ba-4c55-babe-94a5edc318fd\") " Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.319134 4696 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-bundle" (OuterVolumeSpecName: "bundle") pod "b2fcb79b-86ba-4c55-babe-94a5edc318fd" (UID: "b2fcb79b-86ba-4c55-babe-94a5edc318fd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.325749 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2fcb79b-86ba-4c55-babe-94a5edc318fd-kube-api-access-wt4kv" (OuterVolumeSpecName: "kube-api-access-wt4kv") pod "b2fcb79b-86ba-4c55-babe-94a5edc318fd" (UID: "b2fcb79b-86ba-4c55-babe-94a5edc318fd"). InnerVolumeSpecName "kube-api-access-wt4kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.336310 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-util" (OuterVolumeSpecName: "util") pod "b2fcb79b-86ba-4c55-babe-94a5edc318fd" (UID: "b2fcb79b-86ba-4c55-babe-94a5edc318fd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.419752 4696 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.419786 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt4kv\" (UniqueName: \"kubernetes.io/projected/b2fcb79b-86ba-4c55-babe-94a5edc318fd-kube-api-access-wt4kv\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.419800 4696 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2fcb79b-86ba-4c55-babe-94a5edc318fd-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.924349 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" event={"ID":"b2fcb79b-86ba-4c55-babe-94a5edc318fd","Type":"ContainerDied","Data":"c36e0272ec994fccaa08ca6c3045ebb5db4e8e11ff7931769a9016c89118906d"} Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.924394 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c36e0272ec994fccaa08ca6c3045ebb5db4e8e11ff7931769a9016c89118906d" Mar 18 15:53:12 crc kubenswrapper[4696]: I0318 15:53:12.924423 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.589272 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6"] Mar 18 15:53:19 crc kubenswrapper[4696]: E0318 15:53:19.593866 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerName="util" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.593898 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerName="util" Mar 18 15:53:19 crc kubenswrapper[4696]: E0318 15:53:19.593912 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerName="extract" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.593919 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerName="extract" Mar 18 15:53:19 crc kubenswrapper[4696]: E0318 15:53:19.593934 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerName="pull" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.593942 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerName="pull" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.594098 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2fcb79b-86ba-4c55-babe-94a5edc318fd" containerName="extract" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.594554 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.596980 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-pxcdj" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.618987 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6"] Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.640321 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt558\" (UniqueName: \"kubernetes.io/projected/2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e-kube-api-access-zt558\") pod \"openstack-operator-controller-init-7bc867c5bc-c7qv6\" (UID: \"2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e\") " pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.742287 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt558\" (UniqueName: \"kubernetes.io/projected/2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e-kube-api-access-zt558\") pod \"openstack-operator-controller-init-7bc867c5bc-c7qv6\" (UID: \"2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e\") " pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.774594 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt558\" (UniqueName: \"kubernetes.io/projected/2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e-kube-api-access-zt558\") pod \"openstack-operator-controller-init-7bc867c5bc-c7qv6\" (UID: \"2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e\") " pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" Mar 18 15:53:19 crc kubenswrapper[4696]: I0318 15:53:19.921877 4696 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" Mar 18 15:53:20 crc kubenswrapper[4696]: I0318 15:53:20.373213 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6"] Mar 18 15:53:20 crc kubenswrapper[4696]: I0318 15:53:20.982966 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" event={"ID":"2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e","Type":"ContainerStarted","Data":"45c078e18c7fb7cc6aeb6225a3acd3d07d6426fb711d7b485d0a76fc2aa40313"} Mar 18 15:53:26 crc kubenswrapper[4696]: I0318 15:53:26.014619 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" event={"ID":"2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e","Type":"ContainerStarted","Data":"277d506fd3febbc905726b22c69407a2b575f5259dc8671f8f5bd31f882cf178"} Mar 18 15:53:26 crc kubenswrapper[4696]: I0318 15:53:26.015224 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" Mar 18 15:53:26 crc kubenswrapper[4696]: I0318 15:53:26.053992 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" podStartSLOduration=2.571886728 podStartE2EDuration="7.053971626s" podCreationTimestamp="2026-03-18 15:53:19 +0000 UTC" firstStartedPulling="2026-03-18 15:53:20.386388083 +0000 UTC m=+1043.392562289" lastFinishedPulling="2026-03-18 15:53:24.868472981 +0000 UTC m=+1047.874647187" observedRunningTime="2026-03-18 15:53:26.047138895 +0000 UTC m=+1049.053313121" watchObservedRunningTime="2026-03-18 15:53:26.053971626 +0000 UTC m=+1049.060145832" Mar 18 15:53:39 crc kubenswrapper[4696]: I0318 15:53:39.925100 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-c7qv6" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.547580 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.550669 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.561613 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vft9q" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.566948 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.568002 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.575991 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wf25t" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.577087 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.588013 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.616169 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.617172 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.624719 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-66gbb" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.630451 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.642555 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.643685 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.646145 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-kc5ln" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.665893 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.686292 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.687260 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.687897 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.688963 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.697919 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xhvm4" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.698534 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xfm4k" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.707734 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm8pg\" (UniqueName: \"kubernetes.io/projected/789669f2-e26b-4de8-ad21-801820b5806b-kube-api-access-dm8pg\") pod \"cinder-operator-controller-manager-6d77645966-7s46n\" (UID: \"789669f2-e26b-4de8-ad21-801820b5806b\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.707784 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kxw\" (UniqueName: \"kubernetes.io/projected/00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac-kube-api-access-j6kxw\") pod \"designate-operator-controller-manager-6cc65c69fc-r4qqr\" (UID: \"00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.707833 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8rtt\" (UniqueName: \"kubernetes.io/projected/ef87c345-1284-41dd-a5ae-57ae08c9558e-kube-api-access-d8rtt\") pod \"barbican-operator-controller-manager-5cfd84c587-g2jrg\" (UID: \"ef87c345-1284-41dd-a5ae-57ae08c9558e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" Mar 18 15:53:59 crc kubenswrapper[4696]: 
I0318 15:53:59.727735 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.736556 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.737707 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.744853 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.745879 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nlb2z" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.756787 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.769120 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.770388 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.782010 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mxms7" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.810460 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnpxh\" (UniqueName: \"kubernetes.io/projected/78640d4c-766f-4fd8-ab5f-54687b6fb5c6-kube-api-access-tnpxh\") pod \"horizon-operator-controller-manager-64dc66d669-kfqqk\" (UID: \"78640d4c-766f-4fd8-ab5f-54687b6fb5c6\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.813724 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzs8t\" (UniqueName: \"kubernetes.io/projected/123177a5-da82-4485-990a-d5ced4dbf8ca-kube-api-access-hzs8t\") pod \"glance-operator-controller-manager-7d559dcdbd-tpb84\" (UID: \"123177a5-da82-4485-990a-d5ced4dbf8ca\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.818155 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krxls\" (UniqueName: \"kubernetes.io/projected/1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be-kube-api-access-krxls\") pod \"heat-operator-controller-manager-66dd9d474d-dfclz\" (UID: \"1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.818347 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8rtt\" (UniqueName: 
\"kubernetes.io/projected/ef87c345-1284-41dd-a5ae-57ae08c9558e-kube-api-access-d8rtt\") pod \"barbican-operator-controller-manager-5cfd84c587-g2jrg\" (UID: \"ef87c345-1284-41dd-a5ae-57ae08c9558e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.818612 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm8pg\" (UniqueName: \"kubernetes.io/projected/789669f2-e26b-4de8-ad21-801820b5806b-kube-api-access-dm8pg\") pod \"cinder-operator-controller-manager-6d77645966-7s46n\" (UID: \"789669f2-e26b-4de8-ad21-801820b5806b\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.820343 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kxw\" (UniqueName: \"kubernetes.io/projected/00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac-kube-api-access-j6kxw\") pod \"designate-operator-controller-manager-6cc65c69fc-r4qqr\" (UID: \"00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.841631 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.842716 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.854637 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xhk6l" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.858885 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kxw\" (UniqueName: \"kubernetes.io/projected/00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac-kube-api-access-j6kxw\") pod \"designate-operator-controller-manager-6cc65c69fc-r4qqr\" (UID: \"00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.867213 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8rtt\" (UniqueName: \"kubernetes.io/projected/ef87c345-1284-41dd-a5ae-57ae08c9558e-kube-api-access-d8rtt\") pod \"barbican-operator-controller-manager-5cfd84c587-g2jrg\" (UID: \"ef87c345-1284-41dd-a5ae-57ae08c9558e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.867336 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm8pg\" (UniqueName: \"kubernetes.io/projected/789669f2-e26b-4de8-ad21-801820b5806b-kube-api-access-dm8pg\") pod \"cinder-operator-controller-manager-6d77645966-7s46n\" (UID: \"789669f2-e26b-4de8-ad21-801820b5806b\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.889478 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.900243 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.921192 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnpxh\" (UniqueName: \"kubernetes.io/projected/78640d4c-766f-4fd8-ab5f-54687b6fb5c6-kube-api-access-tnpxh\") pod \"horizon-operator-controller-manager-64dc66d669-kfqqk\" (UID: \"78640d4c-766f-4fd8-ab5f-54687b6fb5c6\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.921237 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzs8t\" (UniqueName: \"kubernetes.io/projected/123177a5-da82-4485-990a-d5ced4dbf8ca-kube-api-access-hzs8t\") pod \"glance-operator-controller-manager-7d559dcdbd-tpb84\" (UID: \"123177a5-da82-4485-990a-d5ced4dbf8ca\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.921284 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krxls\" (UniqueName: \"kubernetes.io/projected/1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be-kube-api-access-krxls\") pod \"heat-operator-controller-manager-66dd9d474d-dfclz\" (UID: \"1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.921307 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp9fh\" (UniqueName: \"kubernetes.io/projected/e5ef6f08-4538-435c-b5c8-42bac561d200-kube-api-access-gp9fh\") pod 
\"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.921335 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.921353 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gjkt\" (UniqueName: \"kubernetes.io/projected/e915aebf-c140-44ee-90b8-ce169df57fd9-kube-api-access-6gjkt\") pod \"ironic-operator-controller-manager-6b77b7676d-5nkd5\" (UID: \"e915aebf-c140-44ee-90b8-ce169df57fd9\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.938254 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.952438 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.953319 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.956454 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd"] Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.957555 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" Mar 18 15:53:59 crc kubenswrapper[4696]: I0318 15:53:59.980142 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-q9fh7" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.000003 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krxls\" (UniqueName: \"kubernetes.io/projected/1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be-kube-api-access-krxls\") pod \"heat-operator-controller-manager-66dd9d474d-dfclz\" (UID: \"1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.006557 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.009961 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.013158 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzs8t\" (UniqueName: \"kubernetes.io/projected/123177a5-da82-4485-990a-d5ced4dbf8ca-kube-api-access-hzs8t\") pod \"glance-operator-controller-manager-7d559dcdbd-tpb84\" (UID: \"123177a5-da82-4485-990a-d5ced4dbf8ca\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.013994 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnpxh\" (UniqueName: \"kubernetes.io/projected/78640d4c-766f-4fd8-ab5f-54687b6fb5c6-kube-api-access-tnpxh\") pod \"horizon-operator-controller-manager-64dc66d669-kfqqk\" (UID: \"78640d4c-766f-4fd8-ab5f-54687b6fb5c6\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.017221 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.027289 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gjkt\" (UniqueName: \"kubernetes.io/projected/e915aebf-c140-44ee-90b8-ce169df57fd9-kube-api-access-6gjkt\") pod \"ironic-operator-controller-manager-6b77b7676d-5nkd5\" (UID: \"e915aebf-c140-44ee-90b8-ce169df57fd9\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.027422 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxckn\" (UniqueName: \"kubernetes.io/projected/61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f-kube-api-access-bxckn\") pod \"keystone-operator-controller-manager-76b87776c9-5s8hj\" (UID: \"61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.027452 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp9fh\" (UniqueName: \"kubernetes.io/projected/e5ef6f08-4538-435c-b5c8-42bac561d200-kube-api-access-gp9fh\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.027482 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:00 crc kubenswrapper[4696]: E0318 15:54:00.027657 4696 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:00 crc kubenswrapper[4696]: E0318 15:54:00.027721 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert podName:e5ef6f08-4538-435c-b5c8-42bac561d200 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:00.527700565 +0000 UTC m=+1083.533874771 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert") pod "infra-operator-controller-manager-5595c7d6ff-gggxc" (UID: "e5ef6f08-4538-435c-b5c8-42bac561d200") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.028409 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.028465 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.029288 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.035304 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.036631 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.041767 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.045662 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.045800 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.048720 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.059531 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.063104 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-89j8q" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.063321 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hhvp8" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.063432 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mncnh" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.098377 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gjkt\" (UniqueName: \"kubernetes.io/projected/e915aebf-c140-44ee-90b8-ce169df57fd9-kube-api-access-6gjkt\") pod 
\"ironic-operator-controller-manager-6b77b7676d-5nkd5\" (UID: \"e915aebf-c140-44ee-90b8-ce169df57fd9\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.104973 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.110690 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.111860 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.122852 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp9fh\" (UniqueName: \"kubernetes.io/projected/e5ef6f08-4538-435c-b5c8-42bac561d200-kube-api-access-gp9fh\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.138295 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-sjc5m" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.139286 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swstm\" (UniqueName: \"kubernetes.io/projected/411ef48e-d8ac-471f-9018-ee5fd534a4c9-kube-api-access-swstm\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-gm92k\" (UID: \"411ef48e-d8ac-471f-9018-ee5fd534a4c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 
15:54:00.139342 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cklfb\" (UniqueName: \"kubernetes.io/projected/e7082f0a-1b24-4fda-b9b2-eb957c569232-kube-api-access-cklfb\") pod \"nova-operator-controller-manager-bc5c78db9-blz69\" (UID: \"e7082f0a-1b24-4fda-b9b2-eb957c569232\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.139384 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxckn\" (UniqueName: \"kubernetes.io/projected/61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f-kube-api-access-bxckn\") pod \"keystone-operator-controller-manager-76b87776c9-5s8hj\" (UID: \"61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.139441 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rglj\" (UniqueName: \"kubernetes.io/projected/13174b57-caf5-46f2-8605-51e4de880253-kube-api-access-2rglj\") pod \"manila-operator-controller-manager-fbf7bbb96-v85hd\" (UID: \"13174b57-caf5-46f2-8605-51e4de880253\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.189305 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.208260 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxckn\" (UniqueName: \"kubernetes.io/projected/61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f-kube-api-access-bxckn\") pod \"keystone-operator-controller-manager-76b87776c9-5s8hj\" (UID: \"61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f\") " 
pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.245412 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rglj\" (UniqueName: \"kubernetes.io/projected/13174b57-caf5-46f2-8605-51e4de880253-kube-api-access-2rglj\") pod \"manila-operator-controller-manager-fbf7bbb96-v85hd\" (UID: \"13174b57-caf5-46f2-8605-51e4de880253\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.245468 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zflsn\" (UniqueName: \"kubernetes.io/projected/e74d1820-3e14-431f-866b-b0ab8b97f20f-kube-api-access-zflsn\") pod \"neutron-operator-controller-manager-6744dd545c-crpf5\" (UID: \"e74d1820-3e14-431f-866b-b0ab8b97f20f\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.245513 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgd9s\" (UniqueName: \"kubernetes.io/projected/afdee753-15ca-42fe-8cc1-937b42d07b85-kube-api-access-lgd9s\") pod \"octavia-operator-controller-manager-56f74467c6-z87fb\" (UID: \"afdee753-15ca-42fe-8cc1-937b42d07b85\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.245565 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swstm\" (UniqueName: \"kubernetes.io/projected/411ef48e-d8ac-471f-9018-ee5fd534a4c9-kube-api-access-swstm\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-gm92k\" (UID: \"411ef48e-d8ac-471f-9018-ee5fd534a4c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" Mar 18 15:54:00 crc kubenswrapper[4696]: 
I0318 15:54:00.245615 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cklfb\" (UniqueName: \"kubernetes.io/projected/e7082f0a-1b24-4fda-b9b2-eb957c569232-kube-api-access-cklfb\") pod \"nova-operator-controller-manager-bc5c78db9-blz69\" (UID: \"e7082f0a-1b24-4fda-b9b2-eb957c569232\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.285646 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.286226 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.372423 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swstm\" (UniqueName: \"kubernetes.io/projected/411ef48e-d8ac-471f-9018-ee5fd534a4c9-kube-api-access-swstm\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-gm92k\" (UID: \"411ef48e-d8ac-471f-9018-ee5fd534a4c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.374748 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zflsn\" (UniqueName: \"kubernetes.io/projected/e74d1820-3e14-431f-866b-b0ab8b97f20f-kube-api-access-zflsn\") pod \"neutron-operator-controller-manager-6744dd545c-crpf5\" (UID: \"e74d1820-3e14-431f-866b-b0ab8b97f20f\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.375112 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cklfb\" (UniqueName: 
\"kubernetes.io/projected/e7082f0a-1b24-4fda-b9b2-eb957c569232-kube-api-access-cklfb\") pod \"nova-operator-controller-manager-bc5c78db9-blz69\" (UID: \"e7082f0a-1b24-4fda-b9b2-eb957c569232\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.375682 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgd9s\" (UniqueName: \"kubernetes.io/projected/afdee753-15ca-42fe-8cc1-937b42d07b85-kube-api-access-lgd9s\") pod \"octavia-operator-controller-manager-56f74467c6-z87fb\" (UID: \"afdee753-15ca-42fe-8cc1-937b42d07b85\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.389778 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564154-pd7vl"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.396676 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.403239 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.413702 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.415209 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.415451 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.425257 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.437739 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rglj\" (UniqueName: \"kubernetes.io/projected/13174b57-caf5-46f2-8605-51e4de880253-kube-api-access-2rglj\") pod \"manila-operator-controller-manager-fbf7bbb96-v85hd\" (UID: \"13174b57-caf5-46f2-8605-51e4de880253\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.453012 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgd9s\" (UniqueName: \"kubernetes.io/projected/afdee753-15ca-42fe-8cc1-937b42d07b85-kube-api-access-lgd9s\") pod \"octavia-operator-controller-manager-56f74467c6-z87fb\" (UID: \"afdee753-15ca-42fe-8cc1-937b42d07b85\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.459476 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.459830 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zflsn\" (UniqueName: 
\"kubernetes.io/projected/e74d1820-3e14-431f-866b-b0ab8b97f20f-kube-api-access-zflsn\") pod \"neutron-operator-controller-manager-6744dd545c-crpf5\" (UID: \"e74d1820-3e14-431f-866b-b0ab8b97f20f\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.460507 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.477700 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.477942 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-6pjrz" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.478281 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-pd7vl"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.479151 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.504023 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.504951 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.516545 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nnc9d" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.562462 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.563428 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.567548 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cw6br" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.578727 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9jhq\" (UniqueName: \"kubernetes.io/projected/0eacab42-0fe3-4d23-b00c-81353faa98f8-kube-api-access-z9jhq\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.607457 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.607641 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjnm\" (UniqueName: \"kubernetes.io/projected/dd6bdd44-a607-4081-a732-f572001c79af-kube-api-access-5jjnm\") pod \"auto-csr-approver-29564154-pd7vl\" (UID: \"dd6bdd44-a607-4081-a732-f572001c79af\") " pod="openshift-infra/auto-csr-approver-29564154-pd7vl" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.607745 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:00 crc kubenswrapper[4696]: E0318 15:54:00.607875 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:00 crc kubenswrapper[4696]: E0318 15:54:00.607927 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert podName:e5ef6f08-4538-435c-b5c8-42bac561d200 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:01.607912574 +0000 UTC m=+1084.614086780 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert") pod "infra-operator-controller-manager-5595c7d6ff-gggxc" (UID: "e5ef6f08-4538-435c-b5c8-42bac561d200") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.595092 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.617630 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.619022 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.642700 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.668751 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.672853 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.705230 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rp7cw" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.711934 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgc7m\" (UniqueName: \"kubernetes.io/projected/48afb8e4-ed3d-4c76-9be0-15279dda8889-kube-api-access-kgc7m\") pod \"placement-operator-controller-manager-659fb58c6b-sbh54\" (UID: \"48afb8e4-ed3d-4c76-9be0-15279dda8889\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.712334 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjnm\" (UniqueName: \"kubernetes.io/projected/dd6bdd44-a607-4081-a732-f572001c79af-kube-api-access-5jjnm\") pod \"auto-csr-approver-29564154-pd7vl\" (UID: \"dd6bdd44-a607-4081-a732-f572001c79af\") " pod="openshift-infra/auto-csr-approver-29564154-pd7vl" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.712458 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9xkd\" (UniqueName: \"kubernetes.io/projected/9597433a-1cf7-4455-8aa6-8709fef284dd-kube-api-access-g9xkd\") pod \"ovn-operator-controller-manager-846c4cdcb7-bn6ct\" (UID: \"9597433a-1cf7-4455-8aa6-8709fef284dd\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.712572 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9jhq\" (UniqueName: \"kubernetes.io/projected/0eacab42-0fe3-4d23-b00c-81353faa98f8-kube-api-access-z9jhq\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.712657 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.712729 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srzgl\" (UniqueName: \"kubernetes.io/projected/288ae45d-6c8b-4034-8e4d-e2af975bda6f-kube-api-access-srzgl\") pod \"swift-operator-controller-manager-867f54bc44-wf58k\" (UID: \"288ae45d-6c8b-4034-8e4d-e2af975bda6f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" Mar 18 15:54:00 crc kubenswrapper[4696]: E0318 15:54:00.714471 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:00 crc kubenswrapper[4696]: E0318 15:54:00.714620 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert podName:0eacab42-0fe3-4d23-b00c-81353faa98f8 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:01.214605952 +0000 UTC m=+1084.220780158 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" (UID: "0eacab42-0fe3-4d23-b00c-81353faa98f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.728564 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.729642 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.734530 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-fx98d" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.739503 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9jhq\" (UniqueName: \"kubernetes.io/projected/0eacab42-0fe3-4d23-b00c-81353faa98f8-kube-api-access-z9jhq\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.757566 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.787110 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjnm\" (UniqueName: \"kubernetes.io/projected/dd6bdd44-a607-4081-a732-f572001c79af-kube-api-access-5jjnm\") pod \"auto-csr-approver-29564154-pd7vl\" (UID: \"dd6bdd44-a607-4081-a732-f572001c79af\") " pod="openshift-infra/auto-csr-approver-29564154-pd7vl" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.794009 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.815512 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9xkd\" (UniqueName: \"kubernetes.io/projected/9597433a-1cf7-4455-8aa6-8709fef284dd-kube-api-access-g9xkd\") pod \"ovn-operator-controller-manager-846c4cdcb7-bn6ct\" (UID: \"9597433a-1cf7-4455-8aa6-8709fef284dd\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.815648 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srzgl\" (UniqueName: \"kubernetes.io/projected/288ae45d-6c8b-4034-8e4d-e2af975bda6f-kube-api-access-srzgl\") pod \"swift-operator-controller-manager-867f54bc44-wf58k\" (UID: \"288ae45d-6c8b-4034-8e4d-e2af975bda6f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.815708 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgc7m\" (UniqueName: \"kubernetes.io/projected/48afb8e4-ed3d-4c76-9be0-15279dda8889-kube-api-access-kgc7m\") pod \"placement-operator-controller-manager-659fb58c6b-sbh54\" (UID: 
\"48afb8e4-ed3d-4c76-9be0-15279dda8889\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.816273 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.817357 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.828037 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-gx7gz" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.832263 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.847955 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.858711 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srzgl\" (UniqueName: \"kubernetes.io/projected/288ae45d-6c8b-4034-8e4d-e2af975bda6f-kube-api-access-srzgl\") pod \"swift-operator-controller-manager-867f54bc44-wf58k\" (UID: \"288ae45d-6c8b-4034-8e4d-e2af975bda6f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.866658 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9xkd\" (UniqueName: \"kubernetes.io/projected/9597433a-1cf7-4455-8aa6-8709fef284dd-kube-api-access-g9xkd\") pod \"ovn-operator-controller-manager-846c4cdcb7-bn6ct\" (UID: \"9597433a-1cf7-4455-8aa6-8709fef284dd\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.872829 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgc7m\" (UniqueName: \"kubernetes.io/projected/48afb8e4-ed3d-4c76-9be0-15279dda8889-kube-api-access-kgc7m\") pod \"placement-operator-controller-manager-659fb58c6b-sbh54\" (UID: \"48afb8e4-ed3d-4c76-9be0-15279dda8889\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.885679 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.886872 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.905834 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-7sbdr" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.918268 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r69kg\" (UniqueName: \"kubernetes.io/projected/a08cb64d-e133-4787-956b-4cef003ea78a-kube-api-access-r69kg\") pod \"telemetry-operator-controller-manager-6d84559f47-x7vwf\" (UID: \"a08cb64d-e133-4787-956b-4cef003ea78a\") " pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.918354 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5zv7\" (UniqueName: \"kubernetes.io/projected/06cdd947-c4dd-4ccf-bb4b-fffef57443d4-kube-api-access-q5zv7\") pod \"test-operator-controller-manager-8467ccb4c8-p7snf\" (UID: \"06cdd947-c4dd-4ccf-bb4b-fffef57443d4\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.918377 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn6n5\" (UniqueName: \"kubernetes.io/projected/fa515a71-3c55-46b7-bab2-60cef0a2b2e1-kube-api-access-dn6n5\") pod \"watcher-operator-controller-manager-74d6f7b5c-8hndt\" (UID: \"fa515a71-3c55-46b7-bab2-60cef0a2b2e1\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.919283 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 
15:54:00.952977 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt"] Mar 18 15:54:00 crc kubenswrapper[4696]: I0318 15:54:00.963253 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:00.999461 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.001084 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.004027 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bw9kf" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.004241 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.004439 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.021492 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r69kg\" (UniqueName: \"kubernetes.io/projected/a08cb64d-e133-4787-956b-4cef003ea78a-kube-api-access-r69kg\") pod \"telemetry-operator-controller-manager-6d84559f47-x7vwf\" (UID: \"a08cb64d-e133-4787-956b-4cef003ea78a\") " pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.021656 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5zv7\" (UniqueName: 
\"kubernetes.io/projected/06cdd947-c4dd-4ccf-bb4b-fffef57443d4-kube-api-access-q5zv7\") pod \"test-operator-controller-manager-8467ccb4c8-p7snf\" (UID: \"06cdd947-c4dd-4ccf-bb4b-fffef57443d4\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.021700 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn6n5\" (UniqueName: \"kubernetes.io/projected/fa515a71-3c55-46b7-bab2-60cef0a2b2e1-kube-api-access-dn6n5\") pod \"watcher-operator-controller-manager-74d6f7b5c-8hndt\" (UID: \"fa515a71-3c55-46b7-bab2-60cef0a2b2e1\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.034881 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.042745 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.054005 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5zv7\" (UniqueName: \"kubernetes.io/projected/06cdd947-c4dd-4ccf-bb4b-fffef57443d4-kube-api-access-q5zv7\") pod \"test-operator-controller-manager-8467ccb4c8-p7snf\" (UID: \"06cdd947-c4dd-4ccf-bb4b-fffef57443d4\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.055605 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r69kg\" (UniqueName: \"kubernetes.io/projected/a08cb64d-e133-4787-956b-4cef003ea78a-kube-api-access-r69kg\") pod \"telemetry-operator-controller-manager-6d84559f47-x7vwf\" (UID: \"a08cb64d-e133-4787-956b-4cef003ea78a\") " 
pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.060193 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn6n5\" (UniqueName: \"kubernetes.io/projected/fa515a71-3c55-46b7-bab2-60cef0a2b2e1-kube-api-access-dn6n5\") pod \"watcher-operator-controller-manager-74d6f7b5c-8hndt\" (UID: \"fa515a71-3c55-46b7-bab2-60cef0a2b2e1\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.075194 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.076269 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.076938 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.079062 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-68bgr" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.094722 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.111596 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.115064 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.123478 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.123549 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.123614 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgww9\" (UniqueName: \"kubernetes.io/projected/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-kube-api-access-vgww9\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.207112 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.224961 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.225016 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.225042 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dswfm\" (UniqueName: \"kubernetes.io/projected/b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4-kube-api-access-dswfm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-9rttk\" (UID: \"b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.225081 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgww9\" (UniqueName: \"kubernetes.io/projected/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-kube-api-access-vgww9\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.225115 
4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.225128 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.225212 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:01.72518673 +0000 UTC m=+1084.731360946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "webhook-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.225264 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.225313 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert podName:0eacab42-0fe3-4d23-b00c-81353faa98f8 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:02.225297723 +0000 UTC m=+1085.231471989 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" (UID: "0eacab42-0fe3-4d23-b00c-81353faa98f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.225352 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.225373 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:01.725365814 +0000 UTC m=+1084.731540020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "metrics-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.243262 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.269399 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgww9\" (UniqueName: \"kubernetes.io/projected/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-kube-api-access-vgww9\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.326798 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dswfm\" (UniqueName: \"kubernetes.io/projected/b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4-kube-api-access-dswfm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-9rttk\" (UID: \"b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.350648 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dswfm\" (UniqueName: \"kubernetes.io/projected/b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4-kube-api-access-dswfm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-9rttk\" (UID: \"b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.361729 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" event={"ID":"00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac","Type":"ContainerStarted","Data":"5ea6e42e7b79f988c66aec639a4077b476b0cc8b200dd2c258aedd54bba1f2f6"} Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.447840 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.600045 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.632941 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.633169 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.633264 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert podName:e5ef6f08-4538-435c-b5c8-42bac561d200 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:03.633238844 +0000 UTC m=+1086.639413050 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert") pod "infra-operator-controller-manager-5595c7d6ff-gggxc" (UID: "e5ef6f08-4538-435c-b5c8-42bac561d200") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.645943 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.734595 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.735678 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.736707 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.736813 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:02.736792283 +0000 UTC m=+1085.742966489 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "metrics-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.736819 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: E0318 15:54:01.736894 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:02.736878465 +0000 UTC m=+1085.743052671 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "webhook-server-cert" not found Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.872687 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.877729 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.889718 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84"] Mar 18 15:54:01 crc kubenswrapper[4696]: I0318 15:54:01.901921 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n"] Mar 18 15:54:01 crc kubenswrapper[4696]: W0318 
15:54:01.907498 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod789669f2_e26b_4de8_ad21_801820b5806b.slice/crio-be683c0e84921eef601822980da840ce147ad62dc14c63ef4b5f59f458da9932 WatchSource:0}: Error finding container be683c0e84921eef601822980da840ce147ad62dc14c63ef4b5f59f458da9932: Status 404 returned error can't find the container with id be683c0e84921eef601822980da840ce147ad62dc14c63ef4b5f59f458da9932 Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.075961 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5"] Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.076021 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk"] Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.090321 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69"] Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.104409 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd"] Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.132915 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb"] Mar 18 15:54:02 crc kubenswrapper[4696]: W0318 15:54:02.144976 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7082f0a_1b24_4fda_b9b2_eb957c569232.slice/crio-32a99d856eb2754abc6f0aca0927bc88c33a4e99e76f2c8c60141f8f157d3235 WatchSource:0}: Error finding container 32a99d856eb2754abc6f0aca0927bc88c33a4e99e76f2c8c60141f8f157d3235: Status 404 returned error can't find the container with id 
32a99d856eb2754abc6f0aca0927bc88c33a4e99e76f2c8c60141f8f157d3235 Mar 18 15:54:02 crc kubenswrapper[4696]: W0318 15:54:02.146167 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafdee753_15ca_42fe_8cc1_937b42d07b85.slice/crio-97f8ccbea3b0f4f0d488888f62dc7f9f636c2160bed4c9b6b4001e16a9ccba1f WatchSource:0}: Error finding container 97f8ccbea3b0f4f0d488888f62dc7f9f636c2160bed4c9b6b4001e16a9ccba1f: Status 404 returned error can't find the container with id 97f8ccbea3b0f4f0d488888f62dc7f9f636c2160bed4c9b6b4001e16a9ccba1f Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.244579 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.244774 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.244831 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert podName:0eacab42-0fe3-4d23-b00c-81353faa98f8 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:04.244813496 +0000 UTC m=+1087.250987702 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" (UID: "0eacab42-0fe3-4d23-b00c-81353faa98f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.383344 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" event={"ID":"123177a5-da82-4485-990a-d5ced4dbf8ca","Type":"ContainerStarted","Data":"abe22c2ac6838a44a2e54e1fc331c37aaebeb486f3ee2888f95340fada66bc55"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.390233 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" event={"ID":"e915aebf-c140-44ee-90b8-ce169df57fd9","Type":"ContainerStarted","Data":"aa7237b0d84086cc798324d4678630e76e60b0bb668c564a9417ad5337b0fc07"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.391610 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" event={"ID":"e7082f0a-1b24-4fda-b9b2-eb957c569232","Type":"ContainerStarted","Data":"32a99d856eb2754abc6f0aca0927bc88c33a4e99e76f2c8c60141f8f157d3235"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.398385 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" event={"ID":"afdee753-15ca-42fe-8cc1-937b42d07b85","Type":"ContainerStarted","Data":"97f8ccbea3b0f4f0d488888f62dc7f9f636c2160bed4c9b6b4001e16a9ccba1f"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.405975 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" 
event={"ID":"1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be","Type":"ContainerStarted","Data":"2d662415f5c12464a2e1011ea2ffb1608bb6f7dbecf5ccc9b0f72f3223fa3e7c"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.408380 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" event={"ID":"78640d4c-766f-4fd8-ab5f-54687b6fb5c6","Type":"ContainerStarted","Data":"a7e341047e33023d000d69b7001e5b237237245ce0788dab9873feb0d01d584e"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.413327 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" event={"ID":"61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f","Type":"ContainerStarted","Data":"bf7d7827f583815182122b3575aaf54de3644575d7faa19a306ee868ae24cd4c"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.417279 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" event={"ID":"411ef48e-d8ac-471f-9018-ee5fd534a4c9","Type":"ContainerStarted","Data":"5d851fbd697b5721530f336bcd794fa1af19f7f15a61b476026a3c8914192bfe"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.419345 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" event={"ID":"13174b57-caf5-46f2-8605-51e4de880253","Type":"ContainerStarted","Data":"7c78b3e1def32578e0f8d32069e24e64052b1583e6179d4fc0beed350d0bf6c7"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.421448 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" event={"ID":"ef87c345-1284-41dd-a5ae-57ae08c9558e","Type":"ContainerStarted","Data":"887a86fe65b2084e7b2ec7902ab1f60f7df07da42d0eedfd1e7d5568e5c18477"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.424089 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" event={"ID":"789669f2-e26b-4de8-ad21-801820b5806b","Type":"ContainerStarted","Data":"be683c0e84921eef601822980da840ce147ad62dc14c63ef4b5f59f458da9932"} Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.442732 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54"] Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.459667 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt"] Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.473259 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5"] Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.485755 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k"] Mar 18 15:54:02 crc kubenswrapper[4696]: W0318 15:54:02.486415 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa515a71_3c55_46b7_bab2_60cef0a2b2e1.slice/crio-29e72f9624a02c55f01fdbe3bc2eaed8c3eb19b85b173666fd12b48d0bcc81ed WatchSource:0}: Error finding container 29e72f9624a02c55f01fdbe3bc2eaed8c3eb19b85b173666fd12b48d0bcc81ed: Status 404 returned error can't find the container with id 29e72f9624a02c55f01fdbe3bc2eaed8c3eb19b85b173666fd12b48d0bcc81ed Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.492327 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf"] Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.501698 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf"] Mar 18 15:54:02 crc 
kubenswrapper[4696]: W0318 15:54:02.523976 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod288ae45d_6c8b_4034_8e4d_e2af975bda6f.slice/crio-e859c2133c7f4d413ae77f708c6eb1e563e89beac3b53d678afbdb8bc142caf0 WatchSource:0}: Error finding container e859c2133c7f4d413ae77f708c6eb1e563e89beac3b53d678afbdb8bc142caf0: Status 404 returned error can't find the container with id e859c2133c7f4d413ae77f708c6eb1e563e89beac3b53d678afbdb8bc142caf0 Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.563243 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-pd7vl"] Mar 18 15:54:02 crc kubenswrapper[4696]: W0318 15:54:02.567511 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode74d1820_3e14_431f_866b_b0ab8b97f20f.slice/crio-c5b0bbec2d6c9d8601aa17e0a9cf4c9e35a01d13ab1ed5e2a6297a6ff341dff1 WatchSource:0}: Error finding container c5b0bbec2d6c9d8601aa17e0a9cf4c9e35a01d13ab1ed5e2a6297a6ff341dff1: Status 404 returned error can't find the container with id c5b0bbec2d6c9d8601aa17e0a9cf4c9e35a01d13ab1ed5e2a6297a6ff341dff1 Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.574720 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct"] Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.574926 4696 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:54:02 crc kubenswrapper[4696]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 15:54:02 crc kubenswrapper[4696]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5jjnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564154-pd7vl_openshift-infra(dd6bdd44-a607-4081-a732-f572001c79af): ErrImagePull: pull QPS exceeded Mar 18 15:54:02 crc kubenswrapper[4696]: > logger="UnhandledError" Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.576125 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" podUID="dd6bdd44-a607-4081-a732-f572001c79af" Mar 18 15:54:02 crc kubenswrapper[4696]: W0318 15:54:02.579122 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08cb64d_e133_4787_956b_4cef003ea78a.slice/crio-6937188b85345a7d879733f9197b94b294255dbb7869f664d6aeab329ee0ab33 WatchSource:0}: Error finding container 6937188b85345a7d879733f9197b94b294255dbb7869f664d6aeab329ee0ab33: Status 404 returned error can't find the container with id 6937188b85345a7d879733f9197b94b294255dbb7869f664d6aeab329ee0ab33 Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.579369 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g9xkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-846c4cdcb7-bn6ct_openstack-operators(9597433a-1cf7-4455-8aa6-8709fef284dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.580466 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" podUID="9597433a-1cf7-4455-8aa6-8709fef284dd" Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.582784 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zflsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6744dd545c-crpf5_openstack-operators(e74d1820-3e14-431f-866b-b0ab8b97f20f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.583348 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r69kg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6d84559f47-x7vwf_openstack-operators(a08cb64d-e133-4787-956b-4cef003ea78a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.583847 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" podUID="e74d1820-3e14-431f-866b-b0ab8b97f20f" Mar 18 15:54:02 crc 
kubenswrapper[4696]: E0318 15:54:02.584968 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" podUID="a08cb64d-e133-4787-956b-4cef003ea78a" Mar 18 15:54:02 crc kubenswrapper[4696]: W0318 15:54:02.592253 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0eb1bc0_9e8c_4836_b7e4_32be5e48bed4.slice/crio-3bb7554e41db9935a9bc6420343b6ccc205c0a2d2d06a6516543e89ee059e007 WatchSource:0}: Error finding container 3bb7554e41db9935a9bc6420343b6ccc205c0a2d2d06a6516543e89ee059e007: Status 404 returned error can't find the container with id 3bb7554e41db9935a9bc6420343b6ccc205c0a2d2d06a6516543e89ee059e007 Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.593402 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk"] Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.600780 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dswfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-9rttk_openstack-operators(b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.602868 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" podUID="b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4" Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.753202 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:02 crc kubenswrapper[4696]: I0318 15:54:02.753275 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.753369 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.753434 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:04.753417194 +0000 UTC m=+1087.759591400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "webhook-server-cert" not found Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.754037 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:54:02 crc kubenswrapper[4696]: E0318 15:54:02.754111 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:04.754090231 +0000 UTC m=+1087.760264557 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "metrics-server-cert" not found Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.461057 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" event={"ID":"48afb8e4-ed3d-4c76-9be0-15279dda8889","Type":"ContainerStarted","Data":"73c40d8ced97e338f455196ad547265b79291b1d335b9ff846bbb2f4f0d7fb6e"} Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.467883 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" event={"ID":"dd6bdd44-a607-4081-a732-f572001c79af","Type":"ContainerStarted","Data":"8cc2c90fd1c9eff2e4a4d397b86270aa685e8e6f1c15d09ee9b975dd36aaeb94"} Mar 18 15:54:03 crc kubenswrapper[4696]: E0318 15:54:03.470010 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" podUID="dd6bdd44-a607-4081-a732-f572001c79af" Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.473694 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" event={"ID":"e74d1820-3e14-431f-866b-b0ab8b97f20f","Type":"ContainerStarted","Data":"c5b0bbec2d6c9d8601aa17e0a9cf4c9e35a01d13ab1ed5e2a6297a6ff341dff1"} Mar 18 15:54:03 crc kubenswrapper[4696]: E0318 15:54:03.478913 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" podUID="e74d1820-3e14-431f-866b-b0ab8b97f20f" Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.486104 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" event={"ID":"fa515a71-3c55-46b7-bab2-60cef0a2b2e1","Type":"ContainerStarted","Data":"29e72f9624a02c55f01fdbe3bc2eaed8c3eb19b85b173666fd12b48d0bcc81ed"} Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.492679 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" event={"ID":"288ae45d-6c8b-4034-8e4d-e2af975bda6f","Type":"ContainerStarted","Data":"e859c2133c7f4d413ae77f708c6eb1e563e89beac3b53d678afbdb8bc142caf0"} Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.510604 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" 
event={"ID":"9597433a-1cf7-4455-8aa6-8709fef284dd","Type":"ContainerStarted","Data":"12441d729e1167027c03b9f06097492c456f2cc596912b398ccbbc7b8537032a"} Mar 18 15:54:03 crc kubenswrapper[4696]: E0318 15:54:03.530429 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" podUID="9597433a-1cf7-4455-8aa6-8709fef284dd" Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.530847 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" event={"ID":"a08cb64d-e133-4787-956b-4cef003ea78a","Type":"ContainerStarted","Data":"6937188b85345a7d879733f9197b94b294255dbb7869f664d6aeab329ee0ab33"} Mar 18 15:54:03 crc kubenswrapper[4696]: E0318 15:54:03.534301 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" podUID="a08cb64d-e133-4787-956b-4cef003ea78a" Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.536106 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" event={"ID":"06cdd947-c4dd-4ccf-bb4b-fffef57443d4","Type":"ContainerStarted","Data":"76057473d4349efd1dddbe8d5656b4399853116b8af2e0171d3b9b46fa0aefe3"} Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.547375 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" 
event={"ID":"b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4","Type":"ContainerStarted","Data":"3bb7554e41db9935a9bc6420343b6ccc205c0a2d2d06a6516543e89ee059e007"} Mar 18 15:54:03 crc kubenswrapper[4696]: E0318 15:54:03.549295 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" podUID="b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4" Mar 18 15:54:03 crc kubenswrapper[4696]: I0318 15:54:03.673445 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:03 crc kubenswrapper[4696]: E0318 15:54:03.674450 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:03 crc kubenswrapper[4696]: E0318 15:54:03.674506 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert podName:e5ef6f08-4538-435c-b5c8-42bac561d200 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:07.674489146 +0000 UTC m=+1090.680663342 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert") pod "infra-operator-controller-manager-5595c7d6ff-gggxc" (UID: "e5ef6f08-4538-435c-b5c8-42bac561d200") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:04 crc kubenswrapper[4696]: I0318 15:54:04.282560 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.282829 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.282949 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert podName:0eacab42-0fe3-4d23-b00c-81353faa98f8 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:08.282927921 +0000 UTC m=+1091.289102127 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" (UID: "0eacab42-0fe3-4d23-b00c-81353faa98f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.592411 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" podUID="e74d1820-3e14-431f-866b-b0ab8b97f20f" Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.592489 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" podUID="b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4" Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.592894 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" podUID="9597433a-1cf7-4455-8aa6-8709fef284dd" Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.606597 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" podUID="a08cb64d-e133-4787-956b-4cef003ea78a" Mar 18 15:54:04 crc kubenswrapper[4696]: I0318 15:54:04.798838 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:04 crc kubenswrapper[4696]: I0318 15:54:04.798909 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.799174 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.799238 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:08.799219551 +0000 UTC m=+1091.805393757 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "metrics-server-cert" not found Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.799673 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:54:04 crc kubenswrapper[4696]: E0318 15:54:04.799709 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:08.799697493 +0000 UTC m=+1091.805871699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "webhook-server-cert" not found Mar 18 15:54:05 crc kubenswrapper[4696]: E0318 15:54:05.782465 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" podUID="dd6bdd44-a607-4081-a732-f572001c79af" Mar 18 15:54:07 crc kubenswrapper[4696]: I0318 15:54:07.675133 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:07 crc kubenswrapper[4696]: E0318 15:54:07.675355 
4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:07 crc kubenswrapper[4696]: E0318 15:54:07.675607 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert podName:e5ef6f08-4538-435c-b5c8-42bac561d200 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:15.675587217 +0000 UTC m=+1098.681761423 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert") pod "infra-operator-controller-manager-5595c7d6ff-gggxc" (UID: "e5ef6f08-4538-435c-b5c8-42bac561d200") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:08 crc kubenswrapper[4696]: I0318 15:54:08.284474 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:08 crc kubenswrapper[4696]: E0318 15:54:08.284832 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:08 crc kubenswrapper[4696]: E0318 15:54:08.284986 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert podName:0eacab42-0fe3-4d23-b00c-81353faa98f8 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:16.284947365 +0000 UTC m=+1099.291121571 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" (UID: "0eacab42-0fe3-4d23-b00c-81353faa98f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:08 crc kubenswrapper[4696]: I0318 15:54:08.892364 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:08 crc kubenswrapper[4696]: I0318 15:54:08.892423 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:08 crc kubenswrapper[4696]: E0318 15:54:08.892590 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:54:08 crc kubenswrapper[4696]: E0318 15:54:08.892686 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:16.892663261 +0000 UTC m=+1099.898837467 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "webhook-server-cert" not found Mar 18 15:54:08 crc kubenswrapper[4696]: E0318 15:54:08.892602 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:54:08 crc kubenswrapper[4696]: E0318 15:54:08.893078 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:16.893069512 +0000 UTC m=+1099.899243718 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "metrics-server-cert" not found Mar 18 15:54:14 crc kubenswrapper[4696]: E0318 15:54:14.295204 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1" Mar 18 15:54:14 crc kubenswrapper[4696]: E0318 15:54:14.295810 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krxls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-66dd9d474d-dfclz_openstack-operators(1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:54:14 crc kubenswrapper[4696]: E0318 15:54:14.297058 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" podUID="1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be" Mar 18 15:54:14 crc kubenswrapper[4696]: E0318 15:54:14.680745 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1\\\"\"" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" podUID="1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be" Mar 18 15:54:15 crc kubenswrapper[4696]: I0318 15:54:15.600296 4696 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 18 15:54:15 crc kubenswrapper[4696]: I0318 15:54:15.703888 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:15 crc kubenswrapper[4696]: E0318 15:54:15.704057 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:15 crc kubenswrapper[4696]: E0318 15:54:15.704488 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert podName:e5ef6f08-4538-435c-b5c8-42bac561d200 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:31.704468696 +0000 UTC m=+1114.710642902 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert") pod "infra-operator-controller-manager-5595c7d6ff-gggxc" (UID: "e5ef6f08-4538-435c-b5c8-42bac561d200") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:15 crc kubenswrapper[4696]: E0318 15:54:15.754300 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8" Mar 18 15:54:15 crc kubenswrapper[4696]: E0318 15:54:15.754583 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swstm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6f5b7bcd4-gm92k_openstack-operators(411ef48e-d8ac-471f-9018-ee5fd534a4c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:54:15 crc kubenswrapper[4696]: E0318 15:54:15.755819 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" podUID="411ef48e-d8ac-471f-9018-ee5fd534a4c9" Mar 18 15:54:16 crc kubenswrapper[4696]: I0318 15:54:16.313480 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.313675 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.313752 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert podName:0eacab42-0fe3-4d23-b00c-81353faa98f8 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:32.313730401 +0000 UTC m=+1115.319904607 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" (UID: "0eacab42-0fe3-4d23-b00c-81353faa98f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.417792 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27" Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.418000 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q5zv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-p7snf_openstack-operators(06cdd947-c4dd-4ccf-bb4b-fffef57443d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.419315 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" podUID="06cdd947-c4dd-4ccf-bb4b-fffef57443d4" Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.694746 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" podUID="411ef48e-d8ac-471f-9018-ee5fd534a4c9" Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.695200 4696 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" podUID="06cdd947-c4dd-4ccf-bb4b-fffef57443d4" Mar 18 15:54:16 crc kubenswrapper[4696]: I0318 15:54:16.923290 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:16 crc kubenswrapper[4696]: I0318 15:54:16.923445 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.923603 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.923627 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.923668 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:32.923652762 +0000 UTC m=+1115.929826968 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "webhook-server-cert" not found Mar 18 15:54:16 crc kubenswrapper[4696]: E0318 15:54:16.923707 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs podName:caa2772a-b8a8-4d65-8b8d-19d9c03c62d6 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:32.923686853 +0000 UTC m=+1115.929861049 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-njrtk" (UID: "caa2772a-b8a8-4d65-8b8d-19d9c03c62d6") : secret "metrics-server-cert" not found Mar 18 15:54:17 crc kubenswrapper[4696]: E0318 15:54:17.010608 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396" Mar 18 15:54:17 crc kubenswrapper[4696]: E0318 15:54:17.010912 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dm8pg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6d77645966-7s46n_openstack-operators(789669f2-e26b-4de8-ad21-801820b5806b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:54:17 crc kubenswrapper[4696]: E0318 15:54:17.013604 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" podUID="789669f2-e26b-4de8-ad21-801820b5806b" Mar 18 15:54:17 crc kubenswrapper[4696]: E0318 15:54:17.714755 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" podUID="789669f2-e26b-4de8-ad21-801820b5806b" Mar 18 15:54:18 crc kubenswrapper[4696]: E0318 15:54:18.205745 4696 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552" Mar 18 15:54:18 crc kubenswrapper[4696]: E0318 15:54:18.205951 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cklfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-bc5c78db9-blz69_openstack-operators(e7082f0a-1b24-4fda-b9b2-eb957c569232): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:54:18 crc kubenswrapper[4696]: E0318 15:54:18.207134 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" podUID="e7082f0a-1b24-4fda-b9b2-eb957c569232" Mar 18 15:54:18 crc kubenswrapper[4696]: E0318 15:54:18.717602 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552\\\"\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" podUID="e7082f0a-1b24-4fda-b9b2-eb957c569232" Mar 18 15:54:18 crc kubenswrapper[4696]: E0318 15:54:18.802466 4696 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258" Mar 18 15:54:18 crc kubenswrapper[4696]: E0318 15:54:18.802867 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bxckn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-76b87776c9-5s8hj_openstack-operators(61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:54:18 crc kubenswrapper[4696]: E0318 15:54:18.804323 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" podUID="61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f" Mar 18 15:54:19 crc kubenswrapper[4696]: E0318 15:54:19.723425 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" podUID="61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f" Mar 18 15:54:28 crc kubenswrapper[4696]: E0318 15:54:28.856211 4696 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 18 15:54:28 crc kubenswrapper[4696]: E0318 15:54:28.856910 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dswfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-9rttk_openstack-operators(b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:54:28 crc kubenswrapper[4696]: E0318 15:54:28.858415 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" podUID="b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.792338 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" event={"ID":"fa515a71-3c55-46b7-bab2-60cef0a2b2e1","Type":"ContainerStarted","Data":"fb0cc4fcadb0d9bfa81221522a8a90d93a8755a52a464ca3551da013a7bde417"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.792696 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.794382 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" event={"ID":"288ae45d-6c8b-4034-8e4d-e2af975bda6f","Type":"ContainerStarted","Data":"812ded0839191d9aaafe0c54efe5756787bc1c032dd5a44af9a4a9bcdcbbd85a"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.794562 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.796099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" event={"ID":"9597433a-1cf7-4455-8aa6-8709fef284dd","Type":"ContainerStarted","Data":"a82348d087c37ad727cfc76b5686a1ffb48ed4cb7d19d4b5d8de6a3b216f05dd"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.796443 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.797991 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" event={"ID":"78640d4c-766f-4fd8-ab5f-54687b6fb5c6","Type":"ContainerStarted","Data":"776c97adec9fc6f5b03b5fafd2a1c1151096b5032d552b2d0f35f667720d6526"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.798131 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.799632 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" 
event={"ID":"00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac","Type":"ContainerStarted","Data":"3059e7df2f01e668d000d9ef5f49a897164f93974a9b678a78e2ae56274bed19"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.801010 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" event={"ID":"123177a5-da82-4485-990a-d5ced4dbf8ca","Type":"ContainerStarted","Data":"dda182d5599ebd3a74208994c9408687ae02e72510f45d4810f593a4f0e5d093"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.801347 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.802096 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" event={"ID":"e915aebf-c140-44ee-90b8-ce169df57fd9","Type":"ContainerStarted","Data":"5d1354cfc586f90cbfd3f0d61f35185960d82aee038faf1ede30c25f178463fe"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.802380 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.803299 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" event={"ID":"e74d1820-3e14-431f-866b-b0ab8b97f20f","Type":"ContainerStarted","Data":"117c726c8c2c5a7cade46dff53033fcc387953e044b7b2cc206251ad2ff4f628"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.803638 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.804366 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" event={"ID":"1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be","Type":"ContainerStarted","Data":"5e41050f10cc618bfc939e2e47a1d2b4ef427061d62525b8105d92a8f0fb0cbf"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.804515 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.805225 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" event={"ID":"13174b57-caf5-46f2-8605-51e4de880253","Type":"ContainerStarted","Data":"36ae15af23aafc58fc071b12735d8342a6a76f4cc0e26c0415c3992428066107"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.805581 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.806442 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" event={"ID":"dd6bdd44-a607-4081-a732-f572001c79af","Type":"ContainerStarted","Data":"90ffc7e8b38bf04d741b4bb6b29985262ec4a42b5698b0657694bb0b2467c041"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.807601 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" event={"ID":"ef87c345-1284-41dd-a5ae-57ae08c9558e","Type":"ContainerStarted","Data":"d7c976bce96477e7c06b9ea045d3c1ae040b750f1440adb96444f3ff1d82fb33"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.807925 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.808847 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" event={"ID":"48afb8e4-ed3d-4c76-9be0-15279dda8889","Type":"ContainerStarted","Data":"05612b034eebd49821b87b840a0866178afa88c2a3b62ed6725d391a5e397d6e"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.809205 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.817093 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" event={"ID":"afdee753-15ca-42fe-8cc1-937b42d07b85","Type":"ContainerStarted","Data":"3335026fae449f939d2919c2d9d42aff284485bfc18b1db01a2fad06120d76a7"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.817765 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.829210 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" event={"ID":"a08cb64d-e133-4787-956b-4cef003ea78a","Type":"ContainerStarted","Data":"1371265705fe199f8dc4038c00507e829d455a896e08d1ee39ae366074d75151"} Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.829455 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.835563 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" podStartSLOduration=12.974288364 podStartE2EDuration="29.835546624s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.525599637 +0000 UTC m=+1085.531773843" 
lastFinishedPulling="2026-03-18 15:54:19.386857897 +0000 UTC m=+1102.393032103" observedRunningTime="2026-03-18 15:54:29.833992015 +0000 UTC m=+1112.840166221" watchObservedRunningTime="2026-03-18 15:54:29.835546624 +0000 UTC m=+1112.841720830" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.860667 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" podStartSLOduration=12.978665869 podStartE2EDuration="30.860648042s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.099457742 +0000 UTC m=+1085.105631938" lastFinishedPulling="2026-03-18 15:54:19.981439905 +0000 UTC m=+1102.987614111" observedRunningTime="2026-03-18 15:54:29.856281302 +0000 UTC m=+1112.862455518" watchObservedRunningTime="2026-03-18 15:54:29.860648042 +0000 UTC m=+1112.866822238" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.954003 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.958706 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" podStartSLOduration=3.622198298 podStartE2EDuration="29.958685123s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.583277129 +0000 UTC m=+1085.589451335" lastFinishedPulling="2026-03-18 15:54:28.919763954 +0000 UTC m=+1111.925938160" observedRunningTime="2026-03-18 15:54:29.954275373 +0000 UTC m=+1112.960449589" watchObservedRunningTime="2026-03-18 15:54:29.958685123 +0000 UTC m=+1112.964859329" Mar 18 15:54:29 crc kubenswrapper[4696]: I0318 15:54:29.961289 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" 
podStartSLOduration=12.723479044 podStartE2EDuration="29.961280438s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.149272818 +0000 UTC m=+1085.155447024" lastFinishedPulling="2026-03-18 15:54:19.387074212 +0000 UTC m=+1102.393248418" observedRunningTime="2026-03-18 15:54:29.91096095 +0000 UTC m=+1112.917135156" watchObservedRunningTime="2026-03-18 15:54:29.961280438 +0000 UTC m=+1112.967454644" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.040192 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" podStartSLOduration=4.684433335 podStartE2EDuration="31.040169611s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.582693805 +0000 UTC m=+1085.588868011" lastFinishedPulling="2026-03-18 15:54:28.938430081 +0000 UTC m=+1111.944604287" observedRunningTime="2026-03-18 15:54:30.035893174 +0000 UTC m=+1113.042067390" watchObservedRunningTime="2026-03-18 15:54:30.040169611 +0000 UTC m=+1113.046343827" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.040424 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" podStartSLOduration=13.792416317 podStartE2EDuration="31.040417717s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.139087763 +0000 UTC m=+1085.145261969" lastFinishedPulling="2026-03-18 15:54:19.387089163 +0000 UTC m=+1102.393263369" observedRunningTime="2026-03-18 15:54:30.001864563 +0000 UTC m=+1113.008038779" watchObservedRunningTime="2026-03-18 15:54:30.040417717 +0000 UTC m=+1113.046591933" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.139738 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" 
podStartSLOduration=13.257129182 podStartE2EDuration="31.13971706s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.098893028 +0000 UTC m=+1085.105067234" lastFinishedPulling="2026-03-18 15:54:19.981480906 +0000 UTC m=+1102.987655112" observedRunningTime="2026-03-18 15:54:30.127902975 +0000 UTC m=+1113.134077181" watchObservedRunningTime="2026-03-18 15:54:30.13971706 +0000 UTC m=+1113.145891266" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.228229 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" podStartSLOduration=13.386898296 podStartE2EDuration="31.228204323s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:01.545583141 +0000 UTC m=+1084.551757347" lastFinishedPulling="2026-03-18 15:54:19.386889168 +0000 UTC m=+1102.393063374" observedRunningTime="2026-03-18 15:54:30.169877324 +0000 UTC m=+1113.176051530" watchObservedRunningTime="2026-03-18 15:54:30.228204323 +0000 UTC m=+1113.234378529" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.279506 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" podStartSLOduration=5.178371312 podStartE2EDuration="30.279492345s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.550238723 +0000 UTC m=+1085.556412929" lastFinishedPulling="2026-03-18 15:54:27.651359756 +0000 UTC m=+1110.657533962" observedRunningTime="2026-03-18 15:54:30.277956787 +0000 UTC m=+1113.284131003" watchObservedRunningTime="2026-03-18 15:54:30.279492345 +0000 UTC m=+1113.285666551" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.283792 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" 
podStartSLOduration=3.935428871 podStartE2EDuration="30.283773042s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.579230828 +0000 UTC m=+1085.585405034" lastFinishedPulling="2026-03-18 15:54:28.927574999 +0000 UTC m=+1111.933749205" observedRunningTime="2026-03-18 15:54:30.232831638 +0000 UTC m=+1113.239005844" watchObservedRunningTime="2026-03-18 15:54:30.283773042 +0000 UTC m=+1113.289947248" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.342649 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" podStartSLOduration=3.860378055 podStartE2EDuration="30.342632564s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.574413858 +0000 UTC m=+1085.580588064" lastFinishedPulling="2026-03-18 15:54:29.056668367 +0000 UTC m=+1112.062842573" observedRunningTime="2026-03-18 15:54:30.3280734 +0000 UTC m=+1113.334247606" watchObservedRunningTime="2026-03-18 15:54:30.342632564 +0000 UTC m=+1113.348806770" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.368604 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" podStartSLOduration=13.1718753 podStartE2EDuration="31.368581163s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:01.190152834 +0000 UTC m=+1084.196327040" lastFinishedPulling="2026-03-18 15:54:19.386858697 +0000 UTC m=+1102.393032903" observedRunningTime="2026-03-18 15:54:30.363427684 +0000 UTC m=+1113.369601900" watchObservedRunningTime="2026-03-18 15:54:30.368581163 +0000 UTC m=+1113.374755369" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.435935 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" podStartSLOduration=4.029773216 
podStartE2EDuration="31.435910207s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:01.661809388 +0000 UTC m=+1084.667983604" lastFinishedPulling="2026-03-18 15:54:29.067946389 +0000 UTC m=+1112.074120595" observedRunningTime="2026-03-18 15:54:30.428908831 +0000 UTC m=+1113.435083037" watchObservedRunningTime="2026-03-18 15:54:30.435910207 +0000 UTC m=+1113.442084423" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.466024 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" podStartSLOduration=13.010100239 podStartE2EDuration="30.466003809s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.525657298 +0000 UTC m=+1085.531831504" lastFinishedPulling="2026-03-18 15:54:19.981560868 +0000 UTC m=+1102.987735074" observedRunningTime="2026-03-18 15:54:30.464999544 +0000 UTC m=+1113.471173770" watchObservedRunningTime="2026-03-18 15:54:30.466003809 +0000 UTC m=+1113.472178015" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.643985 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" podStartSLOduration=5.896977007 podStartE2EDuration="31.643965509s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:01.904353793 +0000 UTC m=+1084.910527999" lastFinishedPulling="2026-03-18 15:54:27.651342295 +0000 UTC m=+1110.657516501" observedRunningTime="2026-03-18 15:54:30.498636955 +0000 UTC m=+1113.504811421" watchObservedRunningTime="2026-03-18 15:54:30.643965509 +0000 UTC m=+1113.650139725" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.846687 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" 
event={"ID":"06cdd947-c4dd-4ccf-bb4b-fffef57443d4","Type":"ContainerStarted","Data":"d6301ee926991838653d99cfa3ebf68b549085ec50f86f69a3a37fb183dc92e3"} Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.853042 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" Mar 18 15:54:30 crc kubenswrapper[4696]: I0318 15:54:30.902671 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" podStartSLOduration=3.26513061 podStartE2EDuration="30.902646247s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.568613753 +0000 UTC m=+1085.574787959" lastFinishedPulling="2026-03-18 15:54:30.20612939 +0000 UTC m=+1113.212303596" observedRunningTime="2026-03-18 15:54:30.896800851 +0000 UTC m=+1113.902975057" watchObservedRunningTime="2026-03-18 15:54:30.902646247 +0000 UTC m=+1113.908820453" Mar 18 15:54:31 crc kubenswrapper[4696]: E0318 15:54:31.206468 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd6bdd44_a607_4081_a732_f572001c79af.slice/crio-90ffc7e8b38bf04d741b4bb6b29985262ec4a42b5698b0657694bb0b2467c041.scope\": RecentStats: unable to find data in memory cache]" Mar 18 15:54:31 crc kubenswrapper[4696]: I0318 15:54:31.724415 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:31 crc kubenswrapper[4696]: I0318 15:54:31.729293 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/e5ef6f08-4538-435c-b5c8-42bac561d200-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-gggxc\" (UID: \"e5ef6f08-4538-435c-b5c8-42bac561d200\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:31 crc kubenswrapper[4696]: I0318 15:54:31.857448 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:31 crc kubenswrapper[4696]: I0318 15:54:31.858843 4696 generic.go:334] "Generic (PLEG): container finished" podID="dd6bdd44-a607-4081-a732-f572001c79af" containerID="90ffc7e8b38bf04d741b4bb6b29985262ec4a42b5698b0657694bb0b2467c041" exitCode=0 Mar 18 15:54:31 crc kubenswrapper[4696]: I0318 15:54:31.859404 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" event={"ID":"dd6bdd44-a607-4081-a732-f572001c79af","Type":"ContainerDied","Data":"90ffc7e8b38bf04d741b4bb6b29985262ec4a42b5698b0657694bb0b2467c041"} Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.335027 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.341379 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eacab42-0fe3-4d23-b00c-81353faa98f8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj\" (UID: \"0eacab42-0fe3-4d23-b00c-81353faa98f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 
15:54:32.414113 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.669769 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc"] Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.828625 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj"] Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.866245 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" event={"ID":"0eacab42-0fe3-4d23-b00c-81353faa98f8","Type":"ContainerStarted","Data":"555cc317bc8b3f1481609a85cc273f83f1570a6dbd741e3322d4cded31578da1"} Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.867981 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" event={"ID":"789669f2-e26b-4de8-ad21-801820b5806b","Type":"ContainerStarted","Data":"f514cbec2e88cbf1a64b0c30d4bbaf89c3cef2cb153ccafd9cca55f0352cf330"} Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.868367 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.869151 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" event={"ID":"e5ef6f08-4538-435c-b5c8-42bac561d200","Type":"ContainerStarted","Data":"63263c95a9cfd3032eccbf1c3feb202e22fd721184ca714039c8d58fa721cf45"} Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.870977 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" event={"ID":"e7082f0a-1b24-4fda-b9b2-eb957c569232","Type":"ContainerStarted","Data":"b6a7bd8d078eb86afe8d5f5cdadda850529172c13fc6daeb6dc0c981e6b76c15"} Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.871185 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.872472 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" event={"ID":"411ef48e-d8ac-471f-9018-ee5fd534a4c9","Type":"ContainerStarted","Data":"e36ae1311d6edc603d600935bb0825225e6b11ab65b1600ef48d5bcd5f6c2049"} Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.872752 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.888582 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" podStartSLOduration=3.715133818 podStartE2EDuration="33.888565817s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:01.912412285 +0000 UTC m=+1084.918586491" lastFinishedPulling="2026-03-18 15:54:32.085844284 +0000 UTC m=+1115.092018490" observedRunningTime="2026-03-18 15:54:32.884851284 +0000 UTC m=+1115.891025500" watchObservedRunningTime="2026-03-18 15:54:32.888565817 +0000 UTC m=+1115.894740033" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.903290 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" podStartSLOduration=3.598974932 podStartE2EDuration="33.903270725s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" 
firstStartedPulling="2026-03-18 15:54:02.147473422 +0000 UTC m=+1085.153647628" lastFinishedPulling="2026-03-18 15:54:32.451769215 +0000 UTC m=+1115.457943421" observedRunningTime="2026-03-18 15:54:32.903023099 +0000 UTC m=+1115.909197325" watchObservedRunningTime="2026-03-18 15:54:32.903270725 +0000 UTC m=+1115.909444931" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.924297 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" podStartSLOduration=3.868772119 podStartE2EDuration="33.92427825s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:01.894683941 +0000 UTC m=+1084.900858147" lastFinishedPulling="2026-03-18 15:54:31.950190072 +0000 UTC m=+1114.956364278" observedRunningTime="2026-03-18 15:54:32.918903966 +0000 UTC m=+1115.925078172" watchObservedRunningTime="2026-03-18 15:54:32.92427825 +0000 UTC m=+1115.930452456" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.943949 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.944066 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.963942 4696 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:32 crc kubenswrapper[4696]: I0318 15:54:32.964047 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/caa2772a-b8a8-4d65-8b8d-19d9c03c62d6-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-njrtk\" (UID: \"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.077252 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.146112 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.252973 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjnm\" (UniqueName: \"kubernetes.io/projected/dd6bdd44-a607-4081-a732-f572001c79af-kube-api-access-5jjnm\") pod \"dd6bdd44-a607-4081-a732-f572001c79af\" (UID: \"dd6bdd44-a607-4081-a732-f572001c79af\") " Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.270405 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6bdd44-a607-4081-a732-f572001c79af-kube-api-access-5jjnm" (OuterVolumeSpecName: "kube-api-access-5jjnm") pod "dd6bdd44-a607-4081-a732-f572001c79af" (UID: "dd6bdd44-a607-4081-a732-f572001c79af"). InnerVolumeSpecName "kube-api-access-5jjnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.356136 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jjnm\" (UniqueName: \"kubernetes.io/projected/dd6bdd44-a607-4081-a732-f572001c79af-kube-api-access-5jjnm\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.405241 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk"] Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.438852 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-cf8xr"] Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.444578 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-cf8xr"] Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.613001 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f8828e-3b38-4c94-80ed-bb354c8be9d1" path="/var/lib/kubelet/pods/f7f8828e-3b38-4c94-80ed-bb354c8be9d1/volumes" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.885332 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" event={"ID":"dd6bdd44-a607-4081-a732-f572001c79af","Type":"ContainerDied","Data":"8cc2c90fd1c9eff2e4a4d397b86270aa685e8e6f1c15d09ee9b975dd36aaeb94"} Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.885378 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cc2c90fd1c9eff2e4a4d397b86270aa685e8e6f1c15d09ee9b975dd36aaeb94" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.885449 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-pd7vl" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.890847 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" event={"ID":"61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f","Type":"ContainerStarted","Data":"8baba31dd9abfceda8dc02f962d5a33c7f6ec45a1792fa915f36a550dc97d867"} Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.891091 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.892738 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" event={"ID":"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6","Type":"ContainerStarted","Data":"0c0c1cb1131f51e7caad2a99c01e985198096b9bab8dcccc2ce264f04217e5ff"} Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.892777 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" event={"ID":"caa2772a-b8a8-4d65-8b8d-19d9c03c62d6","Type":"ContainerStarted","Data":"2ef94b50f4cbe5a717143018cf7abc0354475637ccf2c33fb59789964a6251e4"} Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 15:54:33.928597 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" podStartSLOduration=33.928579564 podStartE2EDuration="33.928579564s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:54:33.926715087 +0000 UTC m=+1116.932889303" watchObservedRunningTime="2026-03-18 15:54:33.928579564 +0000 UTC m=+1116.934753770" Mar 18 15:54:33 crc kubenswrapper[4696]: I0318 
15:54:33.956176 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" podStartSLOduration=3.467259049 podStartE2EDuration="34.956155573s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:01.878805224 +0000 UTC m=+1084.884979430" lastFinishedPulling="2026-03-18 15:54:33.367701748 +0000 UTC m=+1116.373875954" observedRunningTime="2026-03-18 15:54:33.943796314 +0000 UTC m=+1116.949970520" watchObservedRunningTime="2026-03-18 15:54:33.956155573 +0000 UTC m=+1116.962329779" Mar 18 15:54:34 crc kubenswrapper[4696]: I0318 15:54:34.905338 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:36 crc kubenswrapper[4696]: I0318 15:54:36.920912 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" event={"ID":"0eacab42-0fe3-4d23-b00c-81353faa98f8","Type":"ContainerStarted","Data":"e47074cf6683860329d430a7d6e910898d5c083225cbaa10b521c49360c215bb"} Mar 18 15:54:36 crc kubenswrapper[4696]: I0318 15:54:36.921239 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:36 crc kubenswrapper[4696]: I0318 15:54:36.922134 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" event={"ID":"e5ef6f08-4538-435c-b5c8-42bac561d200","Type":"ContainerStarted","Data":"1f94a515f3c136fbbef4658f33fa9916a589158e00c9d88391f5b47d521239c1"} Mar 18 15:54:36 crc kubenswrapper[4696]: I0318 15:54:36.922268 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:36 crc 
kubenswrapper[4696]: I0318 15:54:36.950983 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" podStartSLOduration=33.245067062 podStartE2EDuration="36.950961891s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:32.831859579 +0000 UTC m=+1115.838033775" lastFinishedPulling="2026-03-18 15:54:36.537754398 +0000 UTC m=+1119.543928604" observedRunningTime="2026-03-18 15:54:36.945064263 +0000 UTC m=+1119.951238479" watchObservedRunningTime="2026-03-18 15:54:36.950961891 +0000 UTC m=+1119.957136107" Mar 18 15:54:36 crc kubenswrapper[4696]: I0318 15:54:36.970252 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" podStartSLOduration=34.170247681 podStartE2EDuration="37.970229563s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:32.687309094 +0000 UTC m=+1115.693483290" lastFinishedPulling="2026-03-18 15:54:36.487290966 +0000 UTC m=+1119.493465172" observedRunningTime="2026-03-18 15:54:36.967599667 +0000 UTC m=+1119.973773883" watchObservedRunningTime="2026-03-18 15:54:36.970229563 +0000 UTC m=+1119.976403779" Mar 18 15:54:39 crc kubenswrapper[4696]: I0318 15:54:39.892236 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-g2jrg" Mar 18 15:54:39 crc kubenswrapper[4696]: I0318 15:54:39.970623 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-r4qqr" Mar 18 15:54:39 crc kubenswrapper[4696]: I0318 15:54:39.970866 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-7s46n" Mar 18 15:54:40 crc 
kubenswrapper[4696]: I0318 15:54:40.013155 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-dfclz" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.029589 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-kfqqk" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.108327 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-5nkd5" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.290661 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-tpb84" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.291370 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-5s8hj" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.399790 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-gm92k" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.424983 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-blz69" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.484571 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-z87fb" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.623392 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v85hd" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.762149 4696 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-crpf5" Mar 18 15:54:40 crc kubenswrapper[4696]: I0318 15:54:40.967425 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-bn6ct" Mar 18 15:54:41 crc kubenswrapper[4696]: I0318 15:54:41.038304 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-sbh54" Mar 18 15:54:41 crc kubenswrapper[4696]: I0318 15:54:41.085415 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-wf58k" Mar 18 15:54:41 crc kubenswrapper[4696]: I0318 15:54:41.122456 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-x7vwf" Mar 18 15:54:41 crc kubenswrapper[4696]: I0318 15:54:41.211569 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-p7snf" Mar 18 15:54:41 crc kubenswrapper[4696]: I0318 15:54:41.246340 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-8hndt" Mar 18 15:54:41 crc kubenswrapper[4696]: I0318 15:54:41.863734 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-gggxc" Mar 18 15:54:42 crc kubenswrapper[4696]: I0318 15:54:42.423760 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj" Mar 18 15:54:43 crc kubenswrapper[4696]: I0318 15:54:43.086839 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-njrtk" Mar 18 15:54:43 crc kubenswrapper[4696]: E0318 15:54:43.599927 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" podUID="b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4" Mar 18 15:54:59 crc kubenswrapper[4696]: I0318 15:54:59.129504 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" event={"ID":"b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4","Type":"ContainerStarted","Data":"399eb5d3c8ae57e8e9746f5baea7e2defd02ded6d98f5bb1d8d8ffebef945fa9"} Mar 18 15:54:59 crc kubenswrapper[4696]: I0318 15:54:59.177712 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9rttk" podStartSLOduration=3.639829799 podStartE2EDuration="59.177683394s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:02.600226643 +0000 UTC m=+1085.606400849" lastFinishedPulling="2026-03-18 15:54:58.138080248 +0000 UTC m=+1141.144254444" observedRunningTime="2026-03-18 15:54:59.168980607 +0000 UTC m=+1142.175154813" watchObservedRunningTime="2026-03-18 15:54:59.177683394 +0000 UTC m=+1142.183857620" Mar 18 15:55:07 crc kubenswrapper[4696]: I0318 15:55:07.048382 4696 scope.go:117] "RemoveContainer" containerID="3036c3d85e69fe08d278b0c5a9e67004b058f6abbb75fc52b22058bf268ae98c" Mar 18 15:55:12 crc kubenswrapper[4696]: I0318 15:55:12.184469 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:55:12 crc kubenswrapper[4696]: I0318 15:55:12.184934 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.467093 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jcz7b"] Mar 18 15:55:17 crc kubenswrapper[4696]: E0318 15:55:17.468330 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6bdd44-a607-4081-a732-f572001c79af" containerName="oc" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.468351 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6bdd44-a607-4081-a732-f572001c79af" containerName="oc" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.468672 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6bdd44-a607-4081-a732-f572001c79af" containerName="oc" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.469631 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.472183 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.472730 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cgsnz" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.474417 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.475687 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.482190 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jcz7b"] Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.547477 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rbdvg"] Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.549140 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.553836 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.560182 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rbdvg"] Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.567841 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzw97\" (UniqueName: \"kubernetes.io/projected/3e76dc0e-a823-4edf-9f54-fd8873a6999f-kube-api-access-rzw97\") pod \"dnsmasq-dns-675f4bcbfc-jcz7b\" (UID: \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.567901 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e76dc0e-a823-4edf-9f54-fd8873a6999f-config\") pod \"dnsmasq-dns-675f4bcbfc-jcz7b\" (UID: \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.669145 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzw97\" (UniqueName: \"kubernetes.io/projected/3e76dc0e-a823-4edf-9f54-fd8873a6999f-kube-api-access-rzw97\") pod \"dnsmasq-dns-675f4bcbfc-jcz7b\" (UID: \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.669227 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e76dc0e-a823-4edf-9f54-fd8873a6999f-config\") pod \"dnsmasq-dns-675f4bcbfc-jcz7b\" (UID: \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:17 crc 
kubenswrapper[4696]: I0318 15:55:17.669272 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.669305 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-config\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.669328 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz78w\" (UniqueName: \"kubernetes.io/projected/a1903e45-4556-48d9-b722-4c5cd7fab11d-kube-api-access-xz78w\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.670789 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e76dc0e-a823-4edf-9f54-fd8873a6999f-config\") pod \"dnsmasq-dns-675f4bcbfc-jcz7b\" (UID: \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.689420 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzw97\" (UniqueName: \"kubernetes.io/projected/3e76dc0e-a823-4edf-9f54-fd8873a6999f-kube-api-access-rzw97\") pod \"dnsmasq-dns-675f4bcbfc-jcz7b\" (UID: \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 
15:55:17.771279 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.771388 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-config\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.771439 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz78w\" (UniqueName: \"kubernetes.io/projected/a1903e45-4556-48d9-b722-4c5cd7fab11d-kube-api-access-xz78w\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.772617 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.773053 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-config\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.790628 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz78w\" 
(UniqueName: \"kubernetes.io/projected/a1903e45-4556-48d9-b722-4c5cd7fab11d-kube-api-access-xz78w\") pod \"dnsmasq-dns-78dd6ddcc-rbdvg\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.795583 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:17 crc kubenswrapper[4696]: I0318 15:55:17.869412 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:18 crc kubenswrapper[4696]: I0318 15:55:18.104203 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jcz7b"] Mar 18 15:55:18 crc kubenswrapper[4696]: I0318 15:55:18.258674 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" event={"ID":"3e76dc0e-a823-4edf-9f54-fd8873a6999f","Type":"ContainerStarted","Data":"899826322bfbe0c4c45a424bfee186f068e52c0b579a43dcbc872f0cffdfd733"} Mar 18 15:55:18 crc kubenswrapper[4696]: I0318 15:55:18.408468 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rbdvg"] Mar 18 15:55:18 crc kubenswrapper[4696]: W0318 15:55:18.414693 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1903e45_4556_48d9_b722_4c5cd7fab11d.slice/crio-3f9e03f17f975286300ac44aaa1eb0d5efeb14485d665174492d7ca0426db498 WatchSource:0}: Error finding container 3f9e03f17f975286300ac44aaa1eb0d5efeb14485d665174492d7ca0426db498: Status 404 returned error can't find the container with id 3f9e03f17f975286300ac44aaa1eb0d5efeb14485d665174492d7ca0426db498 Mar 18 15:55:19 crc kubenswrapper[4696]: I0318 15:55:19.269970 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" 
event={"ID":"a1903e45-4556-48d9-b722-4c5cd7fab11d","Type":"ContainerStarted","Data":"3f9e03f17f975286300ac44aaa1eb0d5efeb14485d665174492d7ca0426db498"} Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.453798 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jcz7b"] Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.485754 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lggnn"] Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.489290 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.498134 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lggnn"] Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.527237 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.527403 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-config\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.527462 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fzrh\" (UniqueName: \"kubernetes.io/projected/371f70ec-7c03-46b5-9bb9-a346c23aef4a-kube-api-access-2fzrh\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.634670 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.635145 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-config\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.635219 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fzrh\" (UniqueName: \"kubernetes.io/projected/371f70ec-7c03-46b5-9bb9-a346c23aef4a-kube-api-access-2fzrh\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.636477 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.636689 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-config\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.678811 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fzrh\" (UniqueName: \"kubernetes.io/projected/371f70ec-7c03-46b5-9bb9-a346c23aef4a-kube-api-access-2fzrh\") pod \"dnsmasq-dns-5ccc8479f9-lggnn\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.777052 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rbdvg"] Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.807910 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-th8mn"] Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.809215 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.831708 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-th8mn"] Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.835172 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.939121 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-config\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.939211 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:20 crc kubenswrapper[4696]: I0318 15:55:20.941357 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcw4p\" (UniqueName: \"kubernetes.io/projected/5ddbc33c-8c29-465b-9c61-b9899324afb4-kube-api-access-wcw4p\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.042354 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcw4p\" (UniqueName: \"kubernetes.io/projected/5ddbc33c-8c29-465b-9c61-b9899324afb4-kube-api-access-wcw4p\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.042500 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-config\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: 
\"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.042557 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.043334 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-config\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.043461 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.110063 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcw4p\" (UniqueName: \"kubernetes.io/projected/5ddbc33c-8c29-465b-9c61-b9899324afb4-kube-api-access-wcw4p\") pod \"dnsmasq-dns-57d769cc4f-th8mn\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") " pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.130387 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.534936 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lggnn"] Mar 18 15:55:21 crc kubenswrapper[4696]: W0318 15:55:21.537196 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod371f70ec_7c03_46b5_9bb9_a346c23aef4a.slice/crio-6854f83723f6664d5be9e48f956241d11db0e4f661387755dee68ab2c07498f6 WatchSource:0}: Error finding container 6854f83723f6664d5be9e48f956241d11db0e4f661387755dee68ab2c07498f6: Status 404 returned error can't find the container with id 6854f83723f6664d5be9e48f956241d11db0e4f661387755dee68ab2c07498f6 Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.661344 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.662946 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.665513 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.666589 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.666710 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.666864 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.667113 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.667280 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mvprf" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.667379 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.667704 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.725462 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-th8mn"] Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.856795 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857137 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff97b\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-kube-api-access-ff97b\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857162 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857185 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0bdb4167-8754-4c20-97ea-b014ce2cafdc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857206 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857223 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857243 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857267 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857282 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0bdb4167-8754-4c20-97ea-b014ce2cafdc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857296 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.857310 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.958674 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.958806 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff97b\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-kube-api-access-ff97b\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.958841 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.958867 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0bdb4167-8754-4c20-97ea-b014ce2cafdc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.958931 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 
15:55:21.958954 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.958985 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.959014 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.959039 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0bdb4167-8754-4c20-97ea-b014ce2cafdc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.959062 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.959083 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.959246 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.959673 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.965182 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.970179 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.970819 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.971088 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0bdb4167-8754-4c20-97ea-b014ce2cafdc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.974672 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.976156 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0bdb4167-8754-4c20-97ea-b014ce2cafdc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.976850 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 crc kubenswrapper[4696]: I0318 15:55:21.982739 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff97b\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-kube-api-access-ff97b\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:21 
crc kubenswrapper[4696]: I0318 15:55:21.991344 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.003811 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.009464 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.026812 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.028397 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.034292 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.038360 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.038670 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ks66j" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.039269 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.039448 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.040371 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.040711 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.041425 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170049 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170123 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170161 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170313 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170365 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnjnk\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-kube-api-access-jnjnk\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170417 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b5b880f-8efc-483f-b734-fa854ddd30dc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170448 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170550 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b5b880f-8efc-483f-b734-fa854ddd30dc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170604 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-config-data\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170655 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.170728 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.271993 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272579 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272622 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnjnk\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-kube-api-access-jnjnk\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272648 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272676 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b5b880f-8efc-483f-b734-fa854ddd30dc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272700 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 
15:55:22.272729 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b5b880f-8efc-483f-b734-fa854ddd30dc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272758 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-config-data\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272793 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272830 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.272899 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.274888 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.275279 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.275581 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.275585 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.275807 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.276100 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-config-data\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 
15:55:22.280047 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b5b880f-8efc-483f-b734-fa854ddd30dc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.280871 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.281796 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.288989 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b5b880f-8efc-483f-b734-fa854ddd30dc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.298584 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnjnk\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-kube-api-access-jnjnk\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.304586 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" 
event={"ID":"371f70ec-7c03-46b5-9bb9-a346c23aef4a","Type":"ContainerStarted","Data":"6854f83723f6664d5be9e48f956241d11db0e4f661387755dee68ab2c07498f6"} Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.305706 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" event={"ID":"5ddbc33c-8c29-465b-9c61-b9899324afb4","Type":"ContainerStarted","Data":"4c4f83141a4b4054ff2303c756b6eee83941e4bb2a86e7b38f280868f6d30dda"} Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.305911 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.387552 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.750373 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.912897 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.917363 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.925030 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.927961 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.928211 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5pvrz" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.928591 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.943251 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:55:22 crc kubenswrapper[4696]: I0318 15:55:22.948811 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.088645 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-kolla-config\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.088744 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.088794 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.088835 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1c1de5-fda6-4306-bde0-736fd76a8f31-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.088888 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jj22\" (UniqueName: \"kubernetes.io/projected/dd1c1de5-fda6-4306-bde0-736fd76a8f31-kube-api-access-4jj22\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.088938 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dd1c1de5-fda6-4306-bde0-736fd76a8f31-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.088958 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1c1de5-fda6-4306-bde0-736fd76a8f31-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.089139 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-config-data-default\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.093337 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.190678 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1c1de5-fda6-4306-bde0-736fd76a8f31-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.190741 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jj22\" (UniqueName: \"kubernetes.io/projected/dd1c1de5-fda6-4306-bde0-736fd76a8f31-kube-api-access-4jj22\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.190771 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dd1c1de5-fda6-4306-bde0-736fd76a8f31-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.190798 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1c1de5-fda6-4306-bde0-736fd76a8f31-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.190869 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-config-data-default\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.190916 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-kolla-config\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.190946 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.191261 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dd1c1de5-fda6-4306-bde0-736fd76a8f31-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.191598 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.191996 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.192074 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.192131 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-config-data-default\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.193537 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1c1de5-fda6-4306-bde0-736fd76a8f31-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.197586 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1c1de5-fda6-4306-bde0-736fd76a8f31-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.248760 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jj22\" (UniqueName: \"kubernetes.io/projected/dd1c1de5-fda6-4306-bde0-736fd76a8f31-kube-api-access-4jj22\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.249233 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1c1de5-fda6-4306-bde0-736fd76a8f31-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.276614 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"dd1c1de5-fda6-4306-bde0-736fd76a8f31\") " pod="openstack/openstack-galera-0" Mar 18 15:55:23 crc kubenswrapper[4696]: I0318 15:55:23.554781 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.184850 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.187843 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.192858 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.194579 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.194580 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7wkvs" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.194753 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.195300 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.315926 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.316010 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f019c6-a59d-4465-8fb8-c47b198c513b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.316081 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvn8m\" (UniqueName: 
\"kubernetes.io/projected/87f019c6-a59d-4465-8fb8-c47b198c513b-kube-api-access-xvn8m\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.316114 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.316480 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.316784 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.316835 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/87f019c6-a59d-4465-8fb8-c47b198c513b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.316865 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/87f019c6-a59d-4465-8fb8-c47b198c513b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.329923 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.330888 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.336481 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-kp5cb" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.341262 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.346821 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.352015 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.422417 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.422471 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-kolla-config\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.422507 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvts5\" (UniqueName: \"kubernetes.io/projected/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-kube-api-access-kvts5\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.422558 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.422588 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-config-data\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.422610 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f019c6-a59d-4465-8fb8-c47b198c513b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.422651 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvn8m\" (UniqueName: \"kubernetes.io/projected/87f019c6-a59d-4465-8fb8-c47b198c513b-kube-api-access-xvn8m\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.424879 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.426611 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.426789 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.426848 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.426894 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/87f019c6-a59d-4465-8fb8-c47b198c513b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.426931 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.426955 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f019c6-a59d-4465-8fb8-c47b198c513b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.427404 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/87f019c6-a59d-4465-8fb8-c47b198c513b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.427468 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.427822 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.428721 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87f019c6-a59d-4465-8fb8-c47b198c513b-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.450782 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f019c6-a59d-4465-8fb8-c47b198c513b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.460159 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvn8m\" (UniqueName: \"kubernetes.io/projected/87f019c6-a59d-4465-8fb8-c47b198c513b-kube-api-access-xvn8m\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.476182 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f019c6-a59d-4465-8fb8-c47b198c513b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.529796 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-config-data\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.529919 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc 
kubenswrapper[4696]: I0318 15:55:24.529977 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.530001 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-kolla-config\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.530024 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvts5\" (UniqueName: \"kubernetes.io/projected/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-kube-api-access-kvts5\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.531234 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-config-data\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.535236 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.536021 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-kolla-config\") 
pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.578452 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.594305 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"87f019c6-a59d-4465-8fb8-c47b198c513b\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.596175 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvts5\" (UniqueName: \"kubernetes.io/projected/88bcbc43-a512-4f0f-8ce6-e6fd9905df8b-kube-api-access-kvts5\") pod \"memcached-0\" (UID: \"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b\") " pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.651041 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 15:55:24 crc kubenswrapper[4696]: I0318 15:55:24.810784 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:26 crc kubenswrapper[4696]: I0318 15:55:26.693308 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:55:26 crc kubenswrapper[4696]: I0318 15:55:26.694879 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:55:26 crc kubenswrapper[4696]: I0318 15:55:26.698952 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n9gp9" Mar 18 15:55:26 crc kubenswrapper[4696]: I0318 15:55:26.704654 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:55:26 crc kubenswrapper[4696]: I0318 15:55:26.774029 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n96vg\" (UniqueName: \"kubernetes.io/projected/40ba5312-af64-42b5-8f85-cd32aa1dd530-kube-api-access-n96vg\") pod \"kube-state-metrics-0\" (UID: \"40ba5312-af64-42b5-8f85-cd32aa1dd530\") " pod="openstack/kube-state-metrics-0" Mar 18 15:55:26 crc kubenswrapper[4696]: I0318 15:55:26.875500 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n96vg\" (UniqueName: \"kubernetes.io/projected/40ba5312-af64-42b5-8f85-cd32aa1dd530-kube-api-access-n96vg\") pod \"kube-state-metrics-0\" (UID: \"40ba5312-af64-42b5-8f85-cd32aa1dd530\") " pod="openstack/kube-state-metrics-0" Mar 18 15:55:26 crc kubenswrapper[4696]: I0318 15:55:26.894253 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n96vg\" (UniqueName: \"kubernetes.io/projected/40ba5312-af64-42b5-8f85-cd32aa1dd530-kube-api-access-n96vg\") pod \"kube-state-metrics-0\" (UID: \"40ba5312-af64-42b5-8f85-cd32aa1dd530\") " pod="openstack/kube-state-metrics-0" Mar 18 15:55:27 crc kubenswrapper[4696]: I0318 15:55:27.028286 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:55:28 crc kubenswrapper[4696]: W0318 15:55:28.155931 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bdb4167_8754_4c20_97ea_b014ce2cafdc.slice/crio-c137c4eb0e12db16624375acf1c743ab9ee4f246665956ce7b75edd7c223a66c WatchSource:0}: Error finding container c137c4eb0e12db16624375acf1c743ab9ee4f246665956ce7b75edd7c223a66c: Status 404 returned error can't find the container with id c137c4eb0e12db16624375acf1c743ab9ee4f246665956ce7b75edd7c223a66c Mar 18 15:55:28 crc kubenswrapper[4696]: I0318 15:55:28.426682 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0bdb4167-8754-4c20-97ea-b014ce2cafdc","Type":"ContainerStarted","Data":"c137c4eb0e12db16624375acf1c743ab9ee4f246665956ce7b75edd7c223a66c"} Mar 18 15:55:28 crc kubenswrapper[4696]: I0318 15:55:28.429315 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b5b880f-8efc-483f-b734-fa854ddd30dc","Type":"ContainerStarted","Data":"619ff68b53517da3000215b05bbd62bf092feb025f5ae29d9886854eeb792fad"} Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.675757 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.681379 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.681579 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.685110 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.685446 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-k9bmv" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.685588 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.685768 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.685833 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.754388 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.754457 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.754568 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.754625 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.754688 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.754714 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-config\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.754812 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.754907 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvr4h\" (UniqueName: 
\"kubernetes.io/projected/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-kube-api-access-cvr4h\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.856403 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.856468 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.856508 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.856556 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-config\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.856594 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" 
Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.856639 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr4h\" (UniqueName: \"kubernetes.io/projected/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-kube-api-access-cvr4h\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.856685 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.856720 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.857428 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.857903 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-config\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.857940 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.858007 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.863743 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.863788 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.864217 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.876755 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvr4h\" (UniqueName: \"kubernetes.io/projected/8df5e2e0-02fe-4be7-ae7d-f92ea79ce510-kube-api-access-cvr4h\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " 
pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:30 crc kubenswrapper[4696]: I0318 15:55:30.895965 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.007101 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.350277 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vb7xn"] Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.351543 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.354597 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.354961 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-w7tss" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.356270 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.371707 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vb7xn"] Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.381083 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-x4bkz"] Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.382750 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.392418 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-x4bkz"] Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.467398 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a37c7e7f-336d-4e95-b9ea-3750b49d4117-scripts\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.467444 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa7f696-eda9-4cd4-953b-0a24e9935290-combined-ca-bundle\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.467470 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-lib\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.468057 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv4bm\" (UniqueName: \"kubernetes.io/projected/efa7f696-eda9-4cd4-953b-0a24e9935290-kube-api-access-pv4bm\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.468237 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-etc-ovs\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.468308 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-log-ovn\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.468998 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-run\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.469048 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa7f696-eda9-4cd4-953b-0a24e9935290-ovn-controller-tls-certs\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.469083 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-log\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.469398 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw8h6\" (UniqueName: 
\"kubernetes.io/projected/a37c7e7f-336d-4e95-b9ea-3750b49d4117-kube-api-access-lw8h6\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.469495 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-run\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.469587 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/efa7f696-eda9-4cd4-953b-0a24e9935290-scripts\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.470511 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-run-ovn\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.572484 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a37c7e7f-336d-4e95-b9ea-3750b49d4117-scripts\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573025 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa7f696-eda9-4cd4-953b-0a24e9935290-combined-ca-bundle\") pod 
\"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573052 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-lib\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573081 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv4bm\" (UniqueName: \"kubernetes.io/projected/efa7f696-eda9-4cd4-953b-0a24e9935290-kube-api-access-pv4bm\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573118 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-etc-ovs\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573139 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-log-ovn\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573183 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-run\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc 
kubenswrapper[4696]: I0318 15:55:31.573200 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa7f696-eda9-4cd4-953b-0a24e9935290-ovn-controller-tls-certs\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573217 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-log\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573250 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw8h6\" (UniqueName: \"kubernetes.io/projected/a37c7e7f-336d-4e95-b9ea-3750b49d4117-kube-api-access-lw8h6\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573281 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-run\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573307 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/efa7f696-eda9-4cd4-953b-0a24e9935290-scripts\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573343 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-run-ovn\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.573924 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-log-ovn\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.574147 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-etc-ovs\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.574345 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-lib\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.574697 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-run-ovn\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.575099 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/efa7f696-eda9-4cd4-953b-0a24e9935290-var-run\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " 
pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.575192 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-log\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.575513 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a37c7e7f-336d-4e95-b9ea-3750b49d4117-scripts\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.576003 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/efa7f696-eda9-4cd4-953b-0a24e9935290-scripts\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.577200 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a37c7e7f-336d-4e95-b9ea-3750b49d4117-var-run\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.577660 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa7f696-eda9-4cd4-953b-0a24e9935290-ovn-controller-tls-certs\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.584939 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa7f696-eda9-4cd4-953b-0a24e9935290-combined-ca-bundle\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.594545 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv4bm\" (UniqueName: \"kubernetes.io/projected/efa7f696-eda9-4cd4-953b-0a24e9935290-kube-api-access-pv4bm\") pod \"ovn-controller-vb7xn\" (UID: \"efa7f696-eda9-4cd4-953b-0a24e9935290\") " pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.596939 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw8h6\" (UniqueName: \"kubernetes.io/projected/a37c7e7f-336d-4e95-b9ea-3750b49d4117-kube-api-access-lw8h6\") pod \"ovn-controller-ovs-x4bkz\" (UID: \"a37c7e7f-336d-4e95-b9ea-3750b49d4117\") " pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.678943 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vb7xn" Mar 18 15:55:31 crc kubenswrapper[4696]: I0318 15:55:31.701730 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.262269 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.266030 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.268870 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-4b5j9" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.269365 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.276798 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.277859 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.283487 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.420048 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.420121 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/50019b99-f0df-4582-ab2a-49f761bc0aa7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.420175 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntt2\" (UniqueName: \"kubernetes.io/projected/50019b99-f0df-4582-ab2a-49f761bc0aa7-kube-api-access-xntt2\") pod 
\"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.420196 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.420224 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.420390 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50019b99-f0df-4582-ab2a-49f761bc0aa7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.420468 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.420567 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50019b99-f0df-4582-ab2a-49f761bc0aa7-config\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 
15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.522260 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/50019b99-f0df-4582-ab2a-49f761bc0aa7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.522338 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntt2\" (UniqueName: \"kubernetes.io/projected/50019b99-f0df-4582-ab2a-49f761bc0aa7-kube-api-access-xntt2\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.522359 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.522386 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.522544 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50019b99-f0df-4582-ab2a-49f761bc0aa7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.522814 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/50019b99-f0df-4582-ab2a-49f761bc0aa7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.523102 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.523174 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.523215 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50019b99-f0df-4582-ab2a-49f761bc0aa7-config\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.523243 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.523706 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50019b99-f0df-4582-ab2a-49f761bc0aa7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 
15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.524420 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50019b99-f0df-4582-ab2a-49f761bc0aa7-config\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.531094 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.533432 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.542014 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/50019b99-f0df-4582-ab2a-49f761bc0aa7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.543989 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntt2\" (UniqueName: \"kubernetes.io/projected/50019b99-f0df-4582-ab2a-49f761bc0aa7-kube-api-access-xntt2\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.557279 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"50019b99-f0df-4582-ab2a-49f761bc0aa7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:33 crc kubenswrapper[4696]: I0318 15:55:33.602030 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:38 crc kubenswrapper[4696]: I0318 15:55:38.120462 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.830248 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.830897 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wcw4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-th8mn_openstack(5ddbc33c-8c29-465b-9c61-b9899324afb4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.832182 4696 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" podUID="5ddbc33c-8c29-465b-9c61-b9899324afb4" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.847100 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.847387 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fzrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-lggnn_openstack(371f70ec-7c03-46b5-9bb9-a346c23aef4a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.848602 4696 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" podUID="371f70ec-7c03-46b5-9bb9-a346c23aef4a" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.850598 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.850840 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzw97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-jcz7b_openstack(3e76dc0e-a823-4edf-9f54-fd8873a6999f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.851961 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" podUID="3e76dc0e-a823-4edf-9f54-fd8873a6999f" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.855912 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.856151 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xz78w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-rbdvg_openstack(a1903e45-4556-48d9-b722-4c5cd7fab11d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:38 crc kubenswrapper[4696]: E0318 15:55:38.857368 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" podUID="a1903e45-4556-48d9-b722-4c5cd7fab11d" Mar 18 15:55:39 crc kubenswrapper[4696]: E0318 15:55:39.526500 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" podUID="371f70ec-7c03-46b5-9bb9-a346c23aef4a" Mar 18 15:55:39 crc kubenswrapper[4696]: E0318 15:55:39.528443 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" podUID="5ddbc33c-8c29-465b-9c61-b9899324afb4" Mar 18 15:55:40 crc kubenswrapper[4696]: W0318 15:55:40.207788 4696 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1c1de5_fda6_4306_bde0_736fd76a8f31.slice/crio-fb007d0ba9cc855047922f7cf7f3e15dfbe9a09e46145af7fe843d42a35561c0 WatchSource:0}: Error finding container fb007d0ba9cc855047922f7cf7f3e15dfbe9a09e46145af7fe843d42a35561c0: Status 404 returned error can't find the container with id fb007d0ba9cc855047922f7cf7f3e15dfbe9a09e46145af7fe843d42a35561c0 Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.416200 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.444360 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.533750 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" event={"ID":"3e76dc0e-a823-4edf-9f54-fd8873a6999f","Type":"ContainerDied","Data":"899826322bfbe0c4c45a424bfee186f068e52c0b579a43dcbc872f0cffdfd733"} Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.533833 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jcz7b" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.539356 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.539383 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rbdvg" event={"ID":"a1903e45-4556-48d9-b722-4c5cd7fab11d","Type":"ContainerDied","Data":"3f9e03f17f975286300ac44aaa1eb0d5efeb14485d665174492d7ca0426db498"} Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.540729 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dd1c1de5-fda6-4306-bde0-736fd76a8f31","Type":"ContainerStarted","Data":"fb007d0ba9cc855047922f7cf7f3e15dfbe9a09e46145af7fe843d42a35561c0"} Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.544495 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e76dc0e-a823-4edf-9f54-fd8873a6999f-config\") pod \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\" (UID: \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\") " Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.544576 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-dns-svc\") pod \"a1903e45-4556-48d9-b722-4c5cd7fab11d\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.544679 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzw97\" (UniqueName: \"kubernetes.io/projected/3e76dc0e-a823-4edf-9f54-fd8873a6999f-kube-api-access-rzw97\") pod \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\" (UID: \"3e76dc0e-a823-4edf-9f54-fd8873a6999f\") " Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.544746 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz78w\" (UniqueName: 
\"kubernetes.io/projected/a1903e45-4556-48d9-b722-4c5cd7fab11d-kube-api-access-xz78w\") pod \"a1903e45-4556-48d9-b722-4c5cd7fab11d\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.544771 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-config\") pod \"a1903e45-4556-48d9-b722-4c5cd7fab11d\" (UID: \"a1903e45-4556-48d9-b722-4c5cd7fab11d\") " Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.545221 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e76dc0e-a823-4edf-9f54-fd8873a6999f-config" (OuterVolumeSpecName: "config") pod "3e76dc0e-a823-4edf-9f54-fd8873a6999f" (UID: "3e76dc0e-a823-4edf-9f54-fd8873a6999f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.545473 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-config" (OuterVolumeSpecName: "config") pod "a1903e45-4556-48d9-b722-4c5cd7fab11d" (UID: "a1903e45-4556-48d9-b722-4c5cd7fab11d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.546215 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1903e45-4556-48d9-b722-4c5cd7fab11d" (UID: "a1903e45-4556-48d9-b722-4c5cd7fab11d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.550724 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e76dc0e-a823-4edf-9f54-fd8873a6999f-kube-api-access-rzw97" (OuterVolumeSpecName: "kube-api-access-rzw97") pod "3e76dc0e-a823-4edf-9f54-fd8873a6999f" (UID: "3e76dc0e-a823-4edf-9f54-fd8873a6999f"). InnerVolumeSpecName "kube-api-access-rzw97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.552578 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1903e45-4556-48d9-b722-4c5cd7fab11d-kube-api-access-xz78w" (OuterVolumeSpecName: "kube-api-access-xz78w") pod "a1903e45-4556-48d9-b722-4c5cd7fab11d" (UID: "a1903e45-4556-48d9-b722-4c5cd7fab11d"). InnerVolumeSpecName "kube-api-access-xz78w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.647054 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e76dc0e-a823-4edf-9f54-fd8873a6999f-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.647092 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.647105 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzw97\" (UniqueName: \"kubernetes.io/projected/3e76dc0e-a823-4edf-9f54-fd8873a6999f-kube-api-access-rzw97\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.647199 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz78w\" (UniqueName: 
\"kubernetes.io/projected/a1903e45-4556-48d9-b722-4c5cd7fab11d-kube-api-access-xz78w\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.647211 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1903e45-4556-48d9-b722-4c5cd7fab11d-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.727106 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 15:55:40 crc kubenswrapper[4696]: W0318 15:55:40.732104 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88bcbc43_a512_4f0f_8ce6_e6fd9905df8b.slice/crio-25178eaa3cf87dcfc6bccf6b9714f144fd592827ef37fc06cdf5c58cf71adb69 WatchSource:0}: Error finding container 25178eaa3cf87dcfc6bccf6b9714f144fd592827ef37fc06cdf5c58cf71adb69: Status 404 returned error can't find the container with id 25178eaa3cf87dcfc6bccf6b9714f144fd592827ef37fc06cdf5c58cf71adb69 Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.854205 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.860027 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vb7xn"] Mar 18 15:55:40 crc kubenswrapper[4696]: W0318 15:55:40.867189 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40ba5312_af64_42b5_8f85_cd32aa1dd530.slice/crio-33bc8c6a97d220f2ed2f335b0f03769321e0139e0f9e8d0b5cb7a6ff1c8759c5 WatchSource:0}: Error finding container 33bc8c6a97d220f2ed2f335b0f03769321e0139e0f9e8d0b5cb7a6ff1c8759c5: Status 404 returned error can't find the container with id 33bc8c6a97d220f2ed2f335b0f03769321e0139e0f9e8d0b5cb7a6ff1c8759c5 Mar 18 15:55:40 crc kubenswrapper[4696]: W0318 15:55:40.869714 4696 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87f019c6_a59d_4465_8fb8_c47b198c513b.slice/crio-14b08fb0819b63fde4f9fe1ae9a61a9a32942f2e7e7f65051350bc07775f67e5 WatchSource:0}: Error finding container 14b08fb0819b63fde4f9fe1ae9a61a9a32942f2e7e7f65051350bc07775f67e5: Status 404 returned error can't find the container with id 14b08fb0819b63fde4f9fe1ae9a61a9a32942f2e7e7f65051350bc07775f67e5 Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.870610 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.923278 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jcz7b"] Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.939622 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jcz7b"] Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.958386 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rbdvg"] Mar 18 15:55:40 crc kubenswrapper[4696]: I0318 15:55:40.975007 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rbdvg"] Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.081684 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:55:41 crc kubenswrapper[4696]: W0318 15:55:41.087075 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8df5e2e0_02fe_4be7_ae7d_f92ea79ce510.slice/crio-d453184484dfdc653db49920161d74f2a39cf6f87506cb8a0f115fbff34b2e06 WatchSource:0}: Error finding container d453184484dfdc653db49920161d74f2a39cf6f87506cb8a0f115fbff34b2e06: Status 404 returned error can't find the container with id d453184484dfdc653db49920161d74f2a39cf6f87506cb8a0f115fbff34b2e06 Mar 18 15:55:41 crc 
kubenswrapper[4696]: I0318 15:55:41.200653 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:55:41 crc kubenswrapper[4696]: W0318 15:55:41.206451 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50019b99_f0df_4582_ab2a_49f761bc0aa7.slice/crio-857dadd8e250be9dbbdeb3d324b0a356915d04bea1cb7ed66e6e9f7bdd8044fc WatchSource:0}: Error finding container 857dadd8e250be9dbbdeb3d324b0a356915d04bea1cb7ed66e6e9f7bdd8044fc: Status 404 returned error can't find the container with id 857dadd8e250be9dbbdeb3d324b0a356915d04bea1cb7ed66e6e9f7bdd8044fc Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.550496 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"50019b99-f0df-4582-ab2a-49f761bc0aa7","Type":"ContainerStarted","Data":"857dadd8e250be9dbbdeb3d324b0a356915d04bea1cb7ed66e6e9f7bdd8044fc"} Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.553534 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510","Type":"ContainerStarted","Data":"d453184484dfdc653db49920161d74f2a39cf6f87506cb8a0f115fbff34b2e06"} Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.555938 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b","Type":"ContainerStarted","Data":"25178eaa3cf87dcfc6bccf6b9714f144fd592827ef37fc06cdf5c58cf71adb69"} Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.557505 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vb7xn" event={"ID":"efa7f696-eda9-4cd4-953b-0a24e9935290","Type":"ContainerStarted","Data":"e9f5c3d9d16bc3416086c774a686f45ac204fabe89b0d58dbe81ffc1b9c7578e"} Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.559541 4696 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/kube-state-metrics-0" event={"ID":"40ba5312-af64-42b5-8f85-cd32aa1dd530","Type":"ContainerStarted","Data":"33bc8c6a97d220f2ed2f335b0f03769321e0139e0f9e8d0b5cb7a6ff1c8759c5"} Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.560779 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"87f019c6-a59d-4465-8fb8-c47b198c513b","Type":"ContainerStarted","Data":"14b08fb0819b63fde4f9fe1ae9a61a9a32942f2e7e7f65051350bc07775f67e5"} Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.610172 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e76dc0e-a823-4edf-9f54-fd8873a6999f" path="/var/lib/kubelet/pods/3e76dc0e-a823-4edf-9f54-fd8873a6999f/volumes" Mar 18 15:55:41 crc kubenswrapper[4696]: I0318 15:55:41.610631 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1903e45-4556-48d9-b722-4c5cd7fab11d" path="/var/lib/kubelet/pods/a1903e45-4556-48d9-b722-4c5cd7fab11d/volumes" Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.070737 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-x4bkz"] Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.184118 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.184381 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.570228 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x4bkz" event={"ID":"a37c7e7f-336d-4e95-b9ea-3750b49d4117","Type":"ContainerStarted","Data":"52048249a6660205361f98a628c7715ef4032eb81860db81e7cab1c728a19fea"} Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.571904 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0bdb4167-8754-4c20-97ea-b014ce2cafdc","Type":"ContainerStarted","Data":"15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d"} Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.573867 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b5b880f-8efc-483f-b734-fa854ddd30dc","Type":"ContainerStarted","Data":"c0fd7f944f641aa39a971d00f949ab3c23bfbfb18ecce7eea052a2f01e079a00"} Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.992559 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hb4gt"] Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.993861 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:42 crc kubenswrapper[4696]: I0318 15:55:42.997303 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.016864 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hb4gt"] Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.098715 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfjgm\" (UniqueName: \"kubernetes.io/projected/0b77b78e-7226-4d19-a9b7-190ad5248eb7-kube-api-access-bfjgm\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.099142 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b77b78e-7226-4d19-a9b7-190ad5248eb7-combined-ca-bundle\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.099166 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b77b78e-7226-4d19-a9b7-190ad5248eb7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.099206 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b77b78e-7226-4d19-a9b7-190ad5248eb7-config\") pod \"ovn-controller-metrics-hb4gt\" (UID: 
\"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.099238 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0b77b78e-7226-4d19-a9b7-190ad5248eb7-ovs-rundir\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.099274 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0b77b78e-7226-4d19-a9b7-190ad5248eb7-ovn-rundir\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.175893 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-th8mn"] Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.203601 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfjgm\" (UniqueName: \"kubernetes.io/projected/0b77b78e-7226-4d19-a9b7-190ad5248eb7-kube-api-access-bfjgm\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.203658 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b77b78e-7226-4d19-a9b7-190ad5248eb7-combined-ca-bundle\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.203687 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b77b78e-7226-4d19-a9b7-190ad5248eb7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.203732 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b77b78e-7226-4d19-a9b7-190ad5248eb7-config\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.203766 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0b77b78e-7226-4d19-a9b7-190ad5248eb7-ovs-rundir\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.203807 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0b77b78e-7226-4d19-a9b7-190ad5248eb7-ovn-rundir\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.204262 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0b77b78e-7226-4d19-a9b7-190ad5248eb7-ovn-rundir\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.204360 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/0b77b78e-7226-4d19-a9b7-190ad5248eb7-ovs-rundir\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.205801 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b77b78e-7226-4d19-a9b7-190ad5248eb7-config\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.222282 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b77b78e-7226-4d19-a9b7-190ad5248eb7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.223472 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b77b78e-7226-4d19-a9b7-190ad5248eb7-combined-ca-bundle\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.231230 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfjgm\" (UniqueName: \"kubernetes.io/projected/0b77b78e-7226-4d19-a9b7-190ad5248eb7-kube-api-access-bfjgm\") pod \"ovn-controller-metrics-hb4gt\" (UID: \"0b77b78e-7226-4d19-a9b7-190ad5248eb7\") " pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.263559 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wvlfj"] Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.265264 4696 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.269673 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.270545 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wvlfj"] Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.304913 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-config\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.305016 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.305060 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5rnb\" (UniqueName: \"kubernetes.io/projected/5df59239-94df-44ef-9b1c-2749bf22e7d5-kube-api-access-l5rnb\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.305084 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.313740 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hb4gt" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.384999 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lggnn"] Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.406741 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-config\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.406848 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.406906 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5rnb\" (UniqueName: \"kubernetes.io/projected/5df59239-94df-44ef-9b1c-2749bf22e7d5-kube-api-access-l5rnb\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.406934 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.407797 
4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-config\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.408551 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.410952 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.434884 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-lfms6"] Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.436414 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.441883 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.445884 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5rnb\" (UniqueName: \"kubernetes.io/projected/5df59239-94df-44ef-9b1c-2749bf22e7d5-kube-api-access-l5rnb\") pod \"dnsmasq-dns-5bf47b49b7-wvlfj\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.456014 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-lfms6"]
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.510983 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-dns-svc\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.511047 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.511077 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6xh7\" (UniqueName: \"kubernetes.io/projected/5971f2ec-035d-482c-8d75-eb8af348a864-kube-api-access-b6xh7\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.511105 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.511155 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-config\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.604698 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.612335 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-config\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.612411 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-dns-svc\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.612478 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.612508 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6xh7\" (UniqueName: \"kubernetes.io/projected/5971f2ec-035d-482c-8d75-eb8af348a864-kube-api-access-b6xh7\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.612565 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.613481 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.615344 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-dns-svc\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.615439 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.615554 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-config\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.632484 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6xh7\" (UniqueName: \"kubernetes.io/projected/5971f2ec-035d-482c-8d75-eb8af348a864-kube-api-access-b6xh7\") pod \"dnsmasq-dns-8554648995-lfms6\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:43 crc kubenswrapper[4696]: I0318 15:55:43.806774 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.792577 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-th8mn"
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.816920 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn"
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.853149 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-dns-svc\") pod \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") "
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.853220 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcw4p\" (UniqueName: \"kubernetes.io/projected/5ddbc33c-8c29-465b-9c61-b9899324afb4-kube-api-access-wcw4p\") pod \"5ddbc33c-8c29-465b-9c61-b9899324afb4\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") "
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.853242 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fzrh\" (UniqueName: \"kubernetes.io/projected/371f70ec-7c03-46b5-9bb9-a346c23aef4a-kube-api-access-2fzrh\") pod \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") "
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.853274 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-dns-svc\") pod \"5ddbc33c-8c29-465b-9c61-b9899324afb4\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") "
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.853297 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-config\") pod \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\" (UID: \"371f70ec-7c03-46b5-9bb9-a346c23aef4a\") "
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.853322 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-config\") pod \"5ddbc33c-8c29-465b-9c61-b9899324afb4\" (UID: \"5ddbc33c-8c29-465b-9c61-b9899324afb4\") "
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.854124 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-config" (OuterVolumeSpecName: "config") pod "5ddbc33c-8c29-465b-9c61-b9899324afb4" (UID: "5ddbc33c-8c29-465b-9c61-b9899324afb4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.854245 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "371f70ec-7c03-46b5-9bb9-a346c23aef4a" (UID: "371f70ec-7c03-46b5-9bb9-a346c23aef4a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.854536 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ddbc33c-8c29-465b-9c61-b9899324afb4" (UID: "5ddbc33c-8c29-465b-9c61-b9899324afb4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.855114 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-config" (OuterVolumeSpecName: "config") pod "371f70ec-7c03-46b5-9bb9-a346c23aef4a" (UID: "371f70ec-7c03-46b5-9bb9-a346c23aef4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.857219 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/371f70ec-7c03-46b5-9bb9-a346c23aef4a-kube-api-access-2fzrh" (OuterVolumeSpecName: "kube-api-access-2fzrh") pod "371f70ec-7c03-46b5-9bb9-a346c23aef4a" (UID: "371f70ec-7c03-46b5-9bb9-a346c23aef4a"). InnerVolumeSpecName "kube-api-access-2fzrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.859378 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ddbc33c-8c29-465b-9c61-b9899324afb4-kube-api-access-wcw4p" (OuterVolumeSpecName: "kube-api-access-wcw4p") pod "5ddbc33c-8c29-465b-9c61-b9899324afb4" (UID: "5ddbc33c-8c29-465b-9c61-b9899324afb4"). InnerVolumeSpecName "kube-api-access-wcw4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.955300 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.955340 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcw4p\" (UniqueName: \"kubernetes.io/projected/5ddbc33c-8c29-465b-9c61-b9899324afb4-kube-api-access-wcw4p\") on node \"crc\" DevicePath \"\""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.955352 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fzrh\" (UniqueName: \"kubernetes.io/projected/371f70ec-7c03-46b5-9bb9-a346c23aef4a-kube-api-access-2fzrh\") on node \"crc\" DevicePath \"\""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.955360 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.955369 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/371f70ec-7c03-46b5-9bb9-a346c23aef4a-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:55:45 crc kubenswrapper[4696]: I0318 15:55:45.955377 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ddbc33c-8c29-465b-9c61-b9899324afb4-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:55:46 crc kubenswrapper[4696]: I0318 15:55:46.622901 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn" event={"ID":"371f70ec-7c03-46b5-9bb9-a346c23aef4a","Type":"ContainerDied","Data":"6854f83723f6664d5be9e48f956241d11db0e4f661387755dee68ab2c07498f6"}
Mar 18 15:55:46 crc kubenswrapper[4696]: I0318 15:55:46.622919 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-lggnn"
Mar 18 15:55:46 crc kubenswrapper[4696]: I0318 15:55:46.627187 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-th8mn" event={"ID":"5ddbc33c-8c29-465b-9c61-b9899324afb4","Type":"ContainerDied","Data":"4c4f83141a4b4054ff2303c756b6eee83941e4bb2a86e7b38f280868f6d30dda"}
Mar 18 15:55:46 crc kubenswrapper[4696]: I0318 15:55:46.627249 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-th8mn"
Mar 18 15:55:46 crc kubenswrapper[4696]: I0318 15:55:46.695621 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lggnn"]
Mar 18 15:55:46 crc kubenswrapper[4696]: I0318 15:55:46.696176 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lggnn"]
Mar 18 15:55:46 crc kubenswrapper[4696]: I0318 15:55:46.727358 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-th8mn"]
Mar 18 15:55:46 crc kubenswrapper[4696]: I0318 15:55:46.733754 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-th8mn"]
Mar 18 15:55:47 crc kubenswrapper[4696]: I0318 15:55:47.607932 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="371f70ec-7c03-46b5-9bb9-a346c23aef4a" path="/var/lib/kubelet/pods/371f70ec-7c03-46b5-9bb9-a346c23aef4a/volumes"
Mar 18 15:55:47 crc kubenswrapper[4696]: I0318 15:55:47.608919 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ddbc33c-8c29-465b-9c61-b9899324afb4" path="/var/lib/kubelet/pods/5ddbc33c-8c29-465b-9c61-b9899324afb4/volumes"
Mar 18 15:55:48 crc kubenswrapper[4696]: I0318 15:55:48.609036 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wvlfj"]
Mar 18 15:55:48 crc kubenswrapper[4696]: I0318 15:55:48.614866 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hb4gt"]
Mar 18 15:55:48 crc kubenswrapper[4696]: I0318 15:55:48.711962 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-lfms6"]
Mar 18 15:55:48 crc kubenswrapper[4696]: W0318 15:55:48.955041 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5971f2ec_035d_482c_8d75_eb8af348a864.slice/crio-aa46b00b66724a8d86cbd42f1b2f96a0e41d24c0e3f1d095e90772f23dcb6e4a WatchSource:0}: Error finding container aa46b00b66724a8d86cbd42f1b2f96a0e41d24c0e3f1d095e90772f23dcb6e4a: Status 404 returned error can't find the container with id aa46b00b66724a8d86cbd42f1b2f96a0e41d24c0e3f1d095e90772f23dcb6e4a
Mar 18 15:55:48 crc kubenswrapper[4696]: W0318 15:55:48.958923 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b77b78e_7226_4d19_a9b7_190ad5248eb7.slice/crio-7d480c78b541fb85db11770719203dc9bb77fe8c706fec2dd64600b97bfb42fb WatchSource:0}: Error finding container 7d480c78b541fb85db11770719203dc9bb77fe8c706fec2dd64600b97bfb42fb: Status 404 returned error can't find the container with id 7d480c78b541fb85db11770719203dc9bb77fe8c706fec2dd64600b97bfb42fb
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.664962 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" event={"ID":"5df59239-94df-44ef-9b1c-2749bf22e7d5","Type":"ContainerStarted","Data":"fc08011bdf9bb35280a69cc76c5fb39d2f1611281e1c02a44091c960d0b776a0"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.672049 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-lfms6" event={"ID":"5971f2ec-035d-482c-8d75-eb8af348a864","Type":"ContainerStarted","Data":"aa46b00b66724a8d86cbd42f1b2f96a0e41d24c0e3f1d095e90772f23dcb6e4a"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.678818 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dd1c1de5-fda6-4306-bde0-736fd76a8f31","Type":"ContainerStarted","Data":"da957ef12e62fdbaabca51934361fd4bc7d7cb1df64ecbd551e212fb82a4fec1"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.685260 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x4bkz" event={"ID":"a37c7e7f-336d-4e95-b9ea-3750b49d4117","Type":"ContainerStarted","Data":"daf6dfe5afe75d4bdd781c6a23035d20ac2f64d3ddf885cae4ada178ba7f88c8"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.697404 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"87f019c6-a59d-4465-8fb8-c47b198c513b","Type":"ContainerStarted","Data":"0a841721ec4d7e8df712645fe7a02b211fb6a71ed20fb884ba671b6339951cd5"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.708104 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510","Type":"ContainerStarted","Data":"35e1035df5fb0d14ce98c6bde2fe74e92c67557cc7b7e85dee71afd19cb88d99"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.724638 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vb7xn" event={"ID":"efa7f696-eda9-4cd4-953b-0a24e9935290","Type":"ContainerStarted","Data":"514def90935432c4f56998acd7cab115aea97cf1fc2da390bfb156ea0928f3e9"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.726127 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vb7xn"
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.729139 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"40ba5312-af64-42b5-8f85-cd32aa1dd530","Type":"ContainerStarted","Data":"6555ec005e32e3316a2e560ccc156e6862b4dff3d46764f1c5215afb69a2b996"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.729713 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.737970 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hb4gt" event={"ID":"0b77b78e-7226-4d19-a9b7-190ad5248eb7","Type":"ContainerStarted","Data":"7d480c78b541fb85db11770719203dc9bb77fe8c706fec2dd64600b97bfb42fb"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.748957 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"50019b99-f0df-4582-ab2a-49f761bc0aa7","Type":"ContainerStarted","Data":"eeab91a44d9ea2cb646c2e6475ffa58d87ea2a2148e47a12289499c325fab3e1"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.752708 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"88bcbc43-a512-4f0f-8ce6-e6fd9905df8b","Type":"ContainerStarted","Data":"f3c32614f1f6af3a7017d0ba094af9e7737b3773690fa4535cb687838437cef9"}
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.752979 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.838696 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.54988266 podStartE2EDuration="25.838677454s" podCreationTimestamp="2026-03-18 15:55:24 +0000 UTC" firstStartedPulling="2026-03-18 15:55:40.734686108 +0000 UTC m=+1183.740860314" lastFinishedPulling="2026-03-18 15:55:48.023480902 +0000 UTC m=+1191.029655108" observedRunningTime="2026-03-18 15:55:49.810510378 +0000 UTC m=+1192.816684584" watchObservedRunningTime="2026-03-18 15:55:49.838677454 +0000 UTC m=+1192.844851660"
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.874467 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vb7xn" podStartSLOduration=11.369698716 podStartE2EDuration="18.87443423s" podCreationTimestamp="2026-03-18 15:55:31 +0000 UTC" firstStartedPulling="2026-03-18 15:55:40.867859095 +0000 UTC m=+1183.874033301" lastFinishedPulling="2026-03-18 15:55:48.372594609 +0000 UTC m=+1191.378768815" observedRunningTime="2026-03-18 15:55:49.869469796 +0000 UTC m=+1192.875644022" watchObservedRunningTime="2026-03-18 15:55:49.87443423 +0000 UTC m=+1192.880608436"
Mar 18 15:55:49 crc kubenswrapper[4696]: I0318 15:55:49.879960 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.661777637 podStartE2EDuration="23.879947718s" podCreationTimestamp="2026-03-18 15:55:26 +0000 UTC" firstStartedPulling="2026-03-18 15:55:40.872388578 +0000 UTC m=+1183.878562784" lastFinishedPulling="2026-03-18 15:55:49.090558649 +0000 UTC m=+1192.096732865" observedRunningTime="2026-03-18 15:55:49.838306975 +0000 UTC m=+1192.844481201" watchObservedRunningTime="2026-03-18 15:55:49.879947718 +0000 UTC m=+1192.886121914"
Mar 18 15:55:50 crc kubenswrapper[4696]: I0318 15:55:50.768871 4696 generic.go:334] "Generic (PLEG): container finished" podID="a37c7e7f-336d-4e95-b9ea-3750b49d4117" containerID="daf6dfe5afe75d4bdd781c6a23035d20ac2f64d3ddf885cae4ada178ba7f88c8" exitCode=0
Mar 18 15:55:50 crc kubenswrapper[4696]: I0318 15:55:50.768964 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x4bkz" event={"ID":"a37c7e7f-336d-4e95-b9ea-3750b49d4117","Type":"ContainerDied","Data":"daf6dfe5afe75d4bdd781c6a23035d20ac2f64d3ddf885cae4ada178ba7f88c8"}
Mar 18 15:55:50 crc kubenswrapper[4696]: I0318 15:55:50.773384 4696 generic.go:334] "Generic (PLEG): container finished" podID="5df59239-94df-44ef-9b1c-2749bf22e7d5" containerID="694045afe111afe7a73e801e335caaa39530e65d13f8eefed57e7c2b55f03c1f" exitCode=0
Mar 18 15:55:50 crc kubenswrapper[4696]: I0318 15:55:50.773496 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" event={"ID":"5df59239-94df-44ef-9b1c-2749bf22e7d5","Type":"ContainerDied","Data":"694045afe111afe7a73e801e335caaa39530e65d13f8eefed57e7c2b55f03c1f"}
Mar 18 15:55:50 crc kubenswrapper[4696]: I0318 15:55:50.775208 4696 generic.go:334] "Generic (PLEG): container finished" podID="5971f2ec-035d-482c-8d75-eb8af348a864" containerID="5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711" exitCode=0
Mar 18 15:55:50 crc kubenswrapper[4696]: I0318 15:55:50.775338 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-lfms6" event={"ID":"5971f2ec-035d-482c-8d75-eb8af348a864","Type":"ContainerDied","Data":"5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711"}
Mar 18 15:55:51 crc kubenswrapper[4696]: I0318 15:55:51.786266 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x4bkz" event={"ID":"a37c7e7f-336d-4e95-b9ea-3750b49d4117","Type":"ContainerStarted","Data":"c5e0bf11754d7ade146b7cfb4b981b3dc57905eab15ff551c23be2ae0e14824f"}
Mar 18 15:55:51 crc kubenswrapper[4696]: I0318 15:55:51.788915 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" event={"ID":"5df59239-94df-44ef-9b1c-2749bf22e7d5","Type":"ContainerStarted","Data":"3ef25a1e5bcdc98b80ea46fbada615161f8bd2c7ebbb0d404572294f8ed015b2"}
Mar 18 15:55:51 crc kubenswrapper[4696]: I0318 15:55:51.789002 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj"
Mar 18 15:55:51 crc kubenswrapper[4696]: I0318 15:55:51.795222 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-lfms6" event={"ID":"5971f2ec-035d-482c-8d75-eb8af348a864","Type":"ContainerStarted","Data":"82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4"}
Mar 18 15:55:51 crc kubenswrapper[4696]: I0318 15:55:51.795715 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-lfms6"
Mar 18 15:55:51 crc kubenswrapper[4696]: I0318 15:55:51.830431 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" podStartSLOduration=8.267440515 podStartE2EDuration="8.830395321s" podCreationTimestamp="2026-03-18 15:55:43 +0000 UTC" firstStartedPulling="2026-03-18 15:55:48.952782267 +0000 UTC m=+1191.958956473" lastFinishedPulling="2026-03-18 15:55:49.515737083 +0000 UTC m=+1192.521911279" observedRunningTime="2026-03-18 15:55:51.821931369 +0000 UTC m=+1194.828105595" watchObservedRunningTime="2026-03-18 15:55:51.830395321 +0000 UTC m=+1194.836569527"
Mar 18 15:55:51 crc kubenswrapper[4696]: I0318 15:55:51.851708 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-lfms6" podStartSLOduration=8.216455518 podStartE2EDuration="8.851688444s" podCreationTimestamp="2026-03-18 15:55:43 +0000 UTC" firstStartedPulling="2026-03-18 15:55:48.995803245 +0000 UTC m=+1192.001977471" lastFinishedPulling="2026-03-18 15:55:49.631036191 +0000 UTC m=+1192.637210397" observedRunningTime="2026-03-18 15:55:51.847676794 +0000 UTC m=+1194.853851000" watchObservedRunningTime="2026-03-18 15:55:51.851688444 +0000 UTC m=+1194.857862650"
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.654806 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.817461 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hb4gt" event={"ID":"0b77b78e-7226-4d19-a9b7-190ad5248eb7","Type":"ContainerStarted","Data":"1bd6104edaba9ca87ab022dba7f0f339bf6989c4a0d1dae74681fc0900a4044e"}
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.819272 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"50019b99-f0df-4582-ab2a-49f761bc0aa7","Type":"ContainerStarted","Data":"5a0a65a449d9953d474e10cdd7d9659f366411fd7ffe2dbe66a1bc8ed20f7f5c"}
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.821743 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8df5e2e0-02fe-4be7-ae7d-f92ea79ce510","Type":"ContainerStarted","Data":"3289d7a524237750e913907f398dda6e92cf7d2972192f6ad60eb356fb3e2684"}
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.824510 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x4bkz" event={"ID":"a37c7e7f-336d-4e95-b9ea-3750b49d4117","Type":"ContainerStarted","Data":"d9ed0ba26cf28728bb90b69a8935ca18a226f100d26f7fbe9bff6e8030d8435d"}
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.824873 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-x4bkz"
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.824946 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-x4bkz"
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.842675 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hb4gt" podStartSLOduration=8.151619757 podStartE2EDuration="12.842645528s" podCreationTimestamp="2026-03-18 15:55:42 +0000 UTC" firstStartedPulling="2026-03-18 15:55:48.965212119 +0000 UTC m=+1191.971386325" lastFinishedPulling="2026-03-18 15:55:53.65623789 +0000 UTC m=+1196.662412096" observedRunningTime="2026-03-18 15:55:54.840513015 +0000 UTC m=+1197.846687231" watchObservedRunningTime="2026-03-18 15:55:54.842645528 +0000 UTC m=+1197.848819734"
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.886624 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.326998576 podStartE2EDuration="25.88660652s" podCreationTimestamp="2026-03-18 15:55:29 +0000 UTC" firstStartedPulling="2026-03-18 15:55:41.08919018 +0000 UTC m=+1184.095364386" lastFinishedPulling="2026-03-18 15:55:53.648798124 +0000 UTC m=+1196.654972330" observedRunningTime="2026-03-18 15:55:54.883731688 +0000 UTC m=+1197.889905904" watchObservedRunningTime="2026-03-18 15:55:54.88660652 +0000 UTC m=+1197.892780726"
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.919292 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-x4bkz" podStartSLOduration=17.988508433 podStartE2EDuration="23.919271568s" podCreationTimestamp="2026-03-18 15:55:31 +0000 UTC" firstStartedPulling="2026-03-18 15:55:42.126272887 +0000 UTC m=+1185.132447093" lastFinishedPulling="2026-03-18 15:55:48.057036022 +0000 UTC m=+1191.063210228" observedRunningTime="2026-03-18 15:55:54.915497754 +0000 UTC m=+1197.921671980" watchObservedRunningTime="2026-03-18 15:55:54.919271568 +0000 UTC m=+1197.925445774"
Mar 18 15:55:54 crc kubenswrapper[4696]: I0318 15:55:54.955376 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.53307363 podStartE2EDuration="22.955350542s" podCreationTimestamp="2026-03-18 15:55:32 +0000 UTC" firstStartedPulling="2026-03-18 15:55:41.20971145 +0000 UTC m=+1184.215885656" lastFinishedPulling="2026-03-18 15:55:53.631988362 +0000 UTC m=+1196.638162568" observedRunningTime="2026-03-18 15:55:54.944691035 +0000 UTC m=+1197.950865261" watchObservedRunningTime="2026-03-18 15:55:54.955350542 +0000 UTC m=+1197.961524748"
Mar 18 15:55:55 crc kubenswrapper[4696]: I0318 15:55:55.008345 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 18 15:55:55 crc kubenswrapper[4696]: I0318 15:55:55.047662 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 18 15:55:55 crc kubenswrapper[4696]: I0318 15:55:55.836193 4696 generic.go:334] "Generic (PLEG): container finished" podID="dd1c1de5-fda6-4306-bde0-736fd76a8f31" containerID="da957ef12e62fdbaabca51934361fd4bc7d7cb1df64ecbd551e212fb82a4fec1" exitCode=0
Mar 18 15:55:55 crc kubenswrapper[4696]: I0318 15:55:55.837062 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dd1c1de5-fda6-4306-bde0-736fd76a8f31","Type":"ContainerDied","Data":"da957ef12e62fdbaabca51934361fd4bc7d7cb1df64ecbd551e212fb82a4fec1"}
Mar 18 15:55:55 crc kubenswrapper[4696]: I0318 15:55:55.838040 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 18 15:55:55 crc kubenswrapper[4696]: I0318 15:55:55.898371 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 18 15:55:56 crc kubenswrapper[4696]: I0318 15:55:56.846570 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dd1c1de5-fda6-4306-bde0-736fd76a8f31","Type":"ContainerStarted","Data":"c1157f5582eca0d472e3a7d1f3f1bdd2ee5756a6d446346756f78f76cf5c7e76"}
Mar 18 15:55:56 crc kubenswrapper[4696]: I0318 15:55:56.848146 4696 generic.go:334] "Generic (PLEG): container finished" podID="87f019c6-a59d-4465-8fb8-c47b198c513b" containerID="0a841721ec4d7e8df712645fe7a02b211fb6a71ed20fb884ba671b6339951cd5" exitCode=0
Mar 18 15:55:56 crc kubenswrapper[4696]: I0318 15:55:56.848212 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"87f019c6-a59d-4465-8fb8-c47b198c513b","Type":"ContainerDied","Data":"0a841721ec4d7e8df712645fe7a02b211fb6a71ed20fb884ba671b6339951cd5"}
Mar 18 15:55:56 crc kubenswrapper[4696]: I0318 15:55:56.880364 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.005648571 podStartE2EDuration="35.880322996s" podCreationTimestamp="2026-03-18 15:55:21 +0000 UTC" firstStartedPulling="2026-03-18 15:55:40.211112898 +0000 UTC m=+1183.217287104" lastFinishedPulling="2026-03-18 15:55:48.085787323 +0000 UTC m=+1191.091961529" observedRunningTime="2026-03-18 15:55:56.873648888 +0000 UTC m=+1199.879823114" watchObservedRunningTime="2026-03-18 15:55:56.880322996 +0000 UTC m=+1199.886497202"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.032662 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wvlfj"]
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.032901 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" podUID="5df59239-94df-44ef-9b1c-2749bf22e7d5" containerName="dnsmasq-dns" containerID="cri-o://3ef25a1e5bcdc98b80ea46fbada615161f8bd2c7ebbb0d404572294f8ed015b2" gracePeriod=10
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.037111 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.042400 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.086685 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5wrdf"]
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.093129 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.118305 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5wrdf"]
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.207754 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-config\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.207818 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k67ww\" (UniqueName: \"kubernetes.io/projected/c4151e21-6506-415e-9dbe-3fe4389838b6-kube-api-access-k67ww\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.207880 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.207905 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.207922 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.309687 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.309968 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.309991 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.310100 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-config\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.310135 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k67ww\" (UniqueName: \"kubernetes.io/projected/c4151e21-6506-415e-9dbe-3fe4389838b6-kube-api-access-k67ww\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.311254 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.311822 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.312417 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.314334 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-config\") pod \"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.335888 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k67ww\" (UniqueName: \"kubernetes.io/projected/c4151e21-6506-415e-9dbe-3fe4389838b6-kube-api-access-k67ww\") pod 
\"dnsmasq-dns-b8fbc5445-5wrdf\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") " pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.446802 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.611541 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.704614 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.867063 4696 generic.go:334] "Generic (PLEG): container finished" podID="5df59239-94df-44ef-9b1c-2749bf22e7d5" containerID="3ef25a1e5bcdc98b80ea46fbada615161f8bd2c7ebbb0d404572294f8ed015b2" exitCode=0 Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.867131 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" event={"ID":"5df59239-94df-44ef-9b1c-2749bf22e7d5","Type":"ContainerDied","Data":"3ef25a1e5bcdc98b80ea46fbada615161f8bd2c7ebbb0d404572294f8ed015b2"} Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.874548 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"87f019c6-a59d-4465-8fb8-c47b198c513b","Type":"ContainerStarted","Data":"1af79baa7e2bc36d39b0ed3b63e8bb962ee7312aa26f42a0a22b6fa5efe607a2"} Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.875781 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.909586 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.608537455 podStartE2EDuration="34.909565245s" podCreationTimestamp="2026-03-18 15:55:23 +0000 UTC" 
firstStartedPulling="2026-03-18 15:55:40.873610099 +0000 UTC m=+1183.879784305" lastFinishedPulling="2026-03-18 15:55:48.174637889 +0000 UTC m=+1191.180812095" observedRunningTime="2026-03-18 15:55:57.906428907 +0000 UTC m=+1200.912603113" watchObservedRunningTime="2026-03-18 15:55:57.909565245 +0000 UTC m=+1200.915739451" Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.926924 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5wrdf"] Mar 18 15:55:57 crc kubenswrapper[4696]: I0318 15:55:57.956004 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.130377 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.132263 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.135244 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.135596 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.135641 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.135826 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-g82x2" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.167351 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.202298 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.205505 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.211388 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-tqmfl" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.211626 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.211781 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.226254 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.227153 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.247280 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.267408 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d256733-b9f7-484d-873a-b77e062f63c8-scripts\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.268836 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d256733-b9f7-484d-873a-b77e062f63c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 
15:55:58.269044 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d256733-b9f7-484d-873a-b77e062f63c8-config\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.269137 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.269232 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.269453 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2hjc\" (UniqueName: \"kubernetes.io/projected/6d256733-b9f7-484d-873a-b77e062f63c8-kube-api-access-c2hjc\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.269569 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.370654 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-ovsdbserver-nb\") pod \"5df59239-94df-44ef-9b1c-2749bf22e7d5\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.370807 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-dns-svc\") pod \"5df59239-94df-44ef-9b1c-2749bf22e7d5\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.370962 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-config\") pod \"5df59239-94df-44ef-9b1c-2749bf22e7d5\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.371070 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5rnb\" (UniqueName: \"kubernetes.io/projected/5df59239-94df-44ef-9b1c-2749bf22e7d5-kube-api-access-l5rnb\") pod \"5df59239-94df-44ef-9b1c-2749bf22e7d5\" (UID: \"5df59239-94df-44ef-9b1c-2749bf22e7d5\") " Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.371467 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2hjc\" (UniqueName: \"kubernetes.io/projected/6d256733-b9f7-484d-873a-b77e062f63c8-kube-api-access-c2hjc\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372004 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" 
Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372113 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfc5351-8c75-4362-8e66-b9ade04d74eb-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372182 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d256733-b9f7-484d-873a-b77e062f63c8-scripts\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372217 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d256733-b9f7-484d-873a-b77e062f63c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372259 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/acfc5351-8c75-4362-8e66-b9ade04d74eb-cache\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372277 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372323 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6d256733-b9f7-484d-873a-b77e062f63c8-config\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372342 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372367 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwgq7\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-kube-api-access-jwgq7\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372403 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372442 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.372472 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/acfc5351-8c75-4362-8e66-b9ade04d74eb-lock\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " 
pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.373128 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d256733-b9f7-484d-873a-b77e062f63c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.374213 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d256733-b9f7-484d-873a-b77e062f63c8-config\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.374778 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d256733-b9f7-484d-873a-b77e062f63c8-scripts\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.377739 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.378416 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.378466 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d256733-b9f7-484d-873a-b77e062f63c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.382447 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df59239-94df-44ef-9b1c-2749bf22e7d5-kube-api-access-l5rnb" (OuterVolumeSpecName: "kube-api-access-l5rnb") pod "5df59239-94df-44ef-9b1c-2749bf22e7d5" (UID: "5df59239-94df-44ef-9b1c-2749bf22e7d5"). InnerVolumeSpecName "kube-api-access-l5rnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.392257 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2hjc\" (UniqueName: \"kubernetes.io/projected/6d256733-b9f7-484d-873a-b77e062f63c8-kube-api-access-c2hjc\") pod \"ovn-northd-0\" (UID: \"6d256733-b9f7-484d-873a-b77e062f63c8\") " pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.421149 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-config" (OuterVolumeSpecName: "config") pod "5df59239-94df-44ef-9b1c-2749bf22e7d5" (UID: "5df59239-94df-44ef-9b1c-2749bf22e7d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.428069 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5df59239-94df-44ef-9b1c-2749bf22e7d5" (UID: "5df59239-94df-44ef-9b1c-2749bf22e7d5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.430784 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5df59239-94df-44ef-9b1c-2749bf22e7d5" (UID: "5df59239-94df-44ef-9b1c-2749bf22e7d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.473792 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwgq7\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-kube-api-access-jwgq7\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.473848 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.473889 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/acfc5351-8c75-4362-8e66-b9ade04d74eb-lock\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.474038 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfc5351-8c75-4362-8e66-b9ade04d74eb-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.474156 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/acfc5351-8c75-4362-8e66-b9ade04d74eb-cache\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.474182 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.474242 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.474258 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.474272 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5rnb\" (UniqueName: \"kubernetes.io/projected/5df59239-94df-44ef-9b1c-2749bf22e7d5-kube-api-access-l5rnb\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.474285 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5df59239-94df-44ef-9b1c-2749bf22e7d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.474403 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.474420 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.474478 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift podName:acfc5351-8c75-4362-8e66-b9ade04d74eb nodeName:}" failed. No retries permitted until 2026-03-18 15:55:58.97445561 +0000 UTC m=+1201.980629816 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift") pod "swift-storage-0" (UID: "acfc5351-8c75-4362-8e66-b9ade04d74eb") : configmap "swift-ring-files" not found Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.474848 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/acfc5351-8c75-4362-8e66-b9ade04d74eb-lock\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.475144 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/acfc5351-8c75-4362-8e66-b9ade04d74eb-cache\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.475544 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.480500 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfc5351-8c75-4362-8e66-b9ade04d74eb-combined-ca-bundle\") 
pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.494707 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwgq7\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-kube-api-access-jwgq7\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.501200 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.505437 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.774866 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jks7s"] Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.775868 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df59239-94df-44ef-9b1c-2749bf22e7d5" containerName="init" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.775898 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df59239-94df-44ef-9b1c-2749bf22e7d5" containerName="init" Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.775962 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df59239-94df-44ef-9b1c-2749bf22e7d5" containerName="dnsmasq-dns" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.775970 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df59239-94df-44ef-9b1c-2749bf22e7d5" containerName="dnsmasq-dns" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.776159 4696 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5df59239-94df-44ef-9b1c-2749bf22e7d5" containerName="dnsmasq-dns" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.776873 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.780135 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.780341 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.781279 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.792724 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jks7s"] Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.808679 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-lfms6" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.850600 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jks7s"] Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.851471 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-7mfs7 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-jks7s" podUID="466d699e-11c8-4c71-8f23-f21039558cdf" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.852996 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xj2ch"] Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.854462 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.869255 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xj2ch"] Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.880380 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-ring-data-devices\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.880443 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/466d699e-11c8-4c71-8f23-f21039558cdf-etc-swift\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.880472 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mfs7\" (UniqueName: \"kubernetes.io/projected/466d699e-11c8-4c71-8f23-f21039558cdf-kube-api-access-7mfs7\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.880507 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-scripts\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.880552 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-combined-ca-bundle\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.880596 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-dispersionconf\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.880720 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-swiftconf\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.889340 4696 generic.go:334] "Generic (PLEG): container finished" podID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerID="c1d07a961f3cc3785a93f226c44aa10c3e1288c80e6465e883ffdef7abb82568" exitCode=0 Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.889879 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" event={"ID":"c4151e21-6506-415e-9dbe-3fe4389838b6","Type":"ContainerDied","Data":"c1d07a961f3cc3785a93f226c44aa10c3e1288c80e6465e883ffdef7abb82568"} Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.889926 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" event={"ID":"c4151e21-6506-415e-9dbe-3fe4389838b6","Type":"ContainerStarted","Data":"c26e8846f0e06cdf3c47a2cbb873eca24e40de0cb6839d925c649c4d83e25535"} Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.900933 4696 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.904676 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wvlfj" event={"ID":"5df59239-94df-44ef-9b1c-2749bf22e7d5","Type":"ContainerDied","Data":"fc08011bdf9bb35280a69cc76c5fb39d2f1611281e1c02a44091c960d0b776a0"} Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.904760 4696 scope.go:117] "RemoveContainer" containerID="3ef25a1e5bcdc98b80ea46fbada615161f8bd2c7ebbb0d404572294f8ed015b2" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.905038 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.984166 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/466d699e-11c8-4c71-8f23-f21039558cdf-etc-swift\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.984235 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-scripts\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.984274 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mfs7\" (UniqueName: \"kubernetes.io/projected/466d699e-11c8-4c71-8f23-f21039558cdf-kube-api-access-7mfs7\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: 
I0318 15:55:58.984454 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-scripts\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.984486 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-dispersionconf\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.984888 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-combined-ca-bundle\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.985390 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62b4b14f-0ab3-4906-9c97-8c3092cd5379-etc-swift\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.985467 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-dispersionconf\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.985498 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-ring-data-devices\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.985592 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.985664 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-swiftconf\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.985841 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-swiftconf\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.986027 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-combined-ca-bundle\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.986062 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rdfgb\" (UniqueName: \"kubernetes.io/projected/62b4b14f-0ab3-4906-9c97-8c3092cd5379-kube-api-access-rdfgb\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.986098 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-ring-data-devices\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.988712 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/466d699e-11c8-4c71-8f23-f21039558cdf-etc-swift\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.989001 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.989017 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:55:58 crc kubenswrapper[4696]: E0318 15:55:58.989060 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift podName:acfc5351-8c75-4362-8e66-b9ade04d74eb nodeName:}" failed. No retries permitted until 2026-03-18 15:55:59.989040644 +0000 UTC m=+1202.995214930 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift") pod "swift-storage-0" (UID: "acfc5351-8c75-4362-8e66-b9ade04d74eb") : configmap "swift-ring-files" not found Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.989120 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-scripts\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.989477 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-ring-data-devices\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.991225 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-dispersionconf\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.991840 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-swiftconf\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.991967 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-combined-ca-bundle\") pod 
\"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:58 crc kubenswrapper[4696]: I0318 15:55:58.997912 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.014335 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mfs7\" (UniqueName: \"kubernetes.io/projected/466d699e-11c8-4c71-8f23-f21039558cdf-kube-api-access-7mfs7\") pod \"swift-ring-rebalance-jks7s\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.079406 4696 scope.go:117] "RemoveContainer" containerID="694045afe111afe7a73e801e335caaa39530e65d13f8eefed57e7c2b55f03c1f" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.087784 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-scripts\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.087855 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-dispersionconf\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.087893 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62b4b14f-0ab3-4906-9c97-8c3092cd5379-etc-swift\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc 
kubenswrapper[4696]: I0318 15:55:59.087915 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-ring-data-devices\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.087968 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-swiftconf\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.088042 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-combined-ca-bundle\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.088062 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdfgb\" (UniqueName: \"kubernetes.io/projected/62b4b14f-0ab3-4906-9c97-8c3092cd5379-kube-api-access-rdfgb\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.089057 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-scripts\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.090260 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-ring-data-devices\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.090894 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62b4b14f-0ab3-4906-9c97-8c3092cd5379-etc-swift\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.092923 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.095651 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-dispersionconf\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.095765 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-swiftconf\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.095896 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-combined-ca-bundle\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: 
I0318 15:55:59.106903 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wvlfj"] Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.109914 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdfgb\" (UniqueName: \"kubernetes.io/projected/62b4b14f-0ab3-4906-9c97-8c3092cd5379-kube-api-access-rdfgb\") pod \"swift-ring-rebalance-xj2ch\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.115880 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wvlfj"] Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.189708 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-ring-data-devices\") pod \"466d699e-11c8-4c71-8f23-f21039558cdf\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.189864 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mfs7\" (UniqueName: \"kubernetes.io/projected/466d699e-11c8-4c71-8f23-f21039558cdf-kube-api-access-7mfs7\") pod \"466d699e-11c8-4c71-8f23-f21039558cdf\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.190025 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-swiftconf\") pod \"466d699e-11c8-4c71-8f23-f21039558cdf\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.190063 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/466d699e-11c8-4c71-8f23-f21039558cdf-etc-swift\") pod 
\"466d699e-11c8-4c71-8f23-f21039558cdf\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.190416 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "466d699e-11c8-4c71-8f23-f21039558cdf" (UID: "466d699e-11c8-4c71-8f23-f21039558cdf"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.190572 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-combined-ca-bundle\") pod \"466d699e-11c8-4c71-8f23-f21039558cdf\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.190612 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-dispersionconf\") pod \"466d699e-11c8-4c71-8f23-f21039558cdf\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.190688 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-scripts\") pod \"466d699e-11c8-4c71-8f23-f21039558cdf\" (UID: \"466d699e-11c8-4c71-8f23-f21039558cdf\") " Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.190913 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/466d699e-11c8-4c71-8f23-f21039558cdf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "466d699e-11c8-4c71-8f23-f21039558cdf" (UID: "466d699e-11c8-4c71-8f23-f21039558cdf"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.191260 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-scripts" (OuterVolumeSpecName: "scripts") pod "466d699e-11c8-4c71-8f23-f21039558cdf" (UID: "466d699e-11c8-4c71-8f23-f21039558cdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.192232 4696 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/466d699e-11c8-4c71-8f23-f21039558cdf-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.192260 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.192271 4696 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/466d699e-11c8-4c71-8f23-f21039558cdf-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.193823 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/466d699e-11c8-4c71-8f23-f21039558cdf-kube-api-access-7mfs7" (OuterVolumeSpecName: "kube-api-access-7mfs7") pod "466d699e-11c8-4c71-8f23-f21039558cdf" (UID: "466d699e-11c8-4c71-8f23-f21039558cdf"). InnerVolumeSpecName "kube-api-access-7mfs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.194898 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "466d699e-11c8-4c71-8f23-f21039558cdf" (UID: "466d699e-11c8-4c71-8f23-f21039558cdf"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.194972 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "466d699e-11c8-4c71-8f23-f21039558cdf" (UID: "466d699e-11c8-4c71-8f23-f21039558cdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.195416 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "466d699e-11c8-4c71-8f23-f21039558cdf" (UID: "466d699e-11c8-4c71-8f23-f21039558cdf"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.293495 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mfs7\" (UniqueName: \"kubernetes.io/projected/466d699e-11c8-4c71-8f23-f21039558cdf-kube-api-access-7mfs7\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.293564 4696 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.293578 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.293590 4696 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/466d699e-11c8-4c71-8f23-f21039558cdf-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.387862 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.675310 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df59239-94df-44ef-9b1c-2749bf22e7d5" path="/var/lib/kubelet/pods/5df59239-94df-44ef-9b1c-2749bf22e7d5/volumes" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.924343 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6d256733-b9f7-484d-873a-b77e062f63c8","Type":"ContainerStarted","Data":"0676fac1c2d4009aa85a0443c04dd459ef7d06b94bde3ee6cbdc50191e2ae7a5"} Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.933634 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jks7s" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.933633 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" event={"ID":"c4151e21-6506-415e-9dbe-3fe4389838b6","Type":"ContainerStarted","Data":"a8bd79c572b1a15ef5a9fbd43e71eaa9854e895448bcf5c07abc87817cd7fa22"} Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.934205 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.962283 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xj2ch"] Mar 18 15:55:59 crc kubenswrapper[4696]: I0318 15:55:59.970486 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" podStartSLOduration=2.970466756 podStartE2EDuration="2.970466756s" podCreationTimestamp="2026-03-18 15:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:55:59.952199778 +0000 UTC m=+1202.958374004" watchObservedRunningTime="2026-03-18 15:55:59.970466756 +0000 UTC m=+1202.976640962" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.010339 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:56:00 crc kubenswrapper[4696]: E0318 15:56:00.010620 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:56:00 crc kubenswrapper[4696]: E0318 15:56:00.010638 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: 
configmap "swift-ring-files" not found Mar 18 15:56:00 crc kubenswrapper[4696]: E0318 15:56:00.010683 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift podName:acfc5351-8c75-4362-8e66-b9ade04d74eb nodeName:}" failed. No retries permitted until 2026-03-18 15:56:02.010668073 +0000 UTC m=+1205.016842279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift") pod "swift-storage-0" (UID: "acfc5351-8c75-4362-8e66-b9ade04d74eb") : configmap "swift-ring-files" not found Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.011026 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jks7s"] Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.019997 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-jks7s"] Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.137637 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8bzbz"] Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.138710 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.141187 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.141512 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.142297 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.153058 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8bzbz"] Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.213671 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfklp\" (UniqueName: \"kubernetes.io/projected/bc7a7cdc-3686-4af9-89ab-fea81132767c-kube-api-access-kfklp\") pod \"auto-csr-approver-29564156-8bzbz\" (UID: \"bc7a7cdc-3686-4af9-89ab-fea81132767c\") " pod="openshift-infra/auto-csr-approver-29564156-8bzbz" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.315169 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfklp\" (UniqueName: \"kubernetes.io/projected/bc7a7cdc-3686-4af9-89ab-fea81132767c-kube-api-access-kfklp\") pod \"auto-csr-approver-29564156-8bzbz\" (UID: \"bc7a7cdc-3686-4af9-89ab-fea81132767c\") " pod="openshift-infra/auto-csr-approver-29564156-8bzbz" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.338049 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfklp\" (UniqueName: \"kubernetes.io/projected/bc7a7cdc-3686-4af9-89ab-fea81132767c-kube-api-access-kfklp\") pod \"auto-csr-approver-29564156-8bzbz\" (UID: \"bc7a7cdc-3686-4af9-89ab-fea81132767c\") " 
pod="openshift-infra/auto-csr-approver-29564156-8bzbz" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.458723 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" Mar 18 15:56:00 crc kubenswrapper[4696]: I0318 15:56:00.946370 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xj2ch" event={"ID":"62b4b14f-0ab3-4906-9c97-8c3092cd5379","Type":"ContainerStarted","Data":"422eeff88d38f4b351db4c5bc6417ba3cbe7238caabee37ffa619764c6cdff29"} Mar 18 15:56:01 crc kubenswrapper[4696]: I0318 15:56:01.034058 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8bzbz"] Mar 18 15:56:01 crc kubenswrapper[4696]: I0318 15:56:01.616909 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="466d699e-11c8-4c71-8f23-f21039558cdf" path="/var/lib/kubelet/pods/466d699e-11c8-4c71-8f23-f21039558cdf/volumes" Mar 18 15:56:01 crc kubenswrapper[4696]: I0318 15:56:01.955670 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6d256733-b9f7-484d-873a-b77e062f63c8","Type":"ContainerStarted","Data":"88a5a858d490e8eba16f722162d4a95cf2b85f260631d42f6a4a448d6b3be8f4"} Mar 18 15:56:01 crc kubenswrapper[4696]: I0318 15:56:01.955723 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6d256733-b9f7-484d-873a-b77e062f63c8","Type":"ContainerStarted","Data":"4e445403eabb4bb7005e72fc6c101eb62cc25d2316fc849532b17a3e60351624"} Mar 18 15:56:01 crc kubenswrapper[4696]: I0318 15:56:01.957177 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 15:56:01 crc kubenswrapper[4696]: I0318 15:56:01.958443 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" 
event={"ID":"bc7a7cdc-3686-4af9-89ab-fea81132767c","Type":"ContainerStarted","Data":"3c26c09e13a86705739ce56c820d35bdaecd46d91996cf9b9eb6a05f04ebdf79"} Mar 18 15:56:02 crc kubenswrapper[4696]: I0318 15:56:02.048664 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:56:02 crc kubenswrapper[4696]: E0318 15:56:02.048885 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:56:02 crc kubenswrapper[4696]: E0318 15:56:02.048919 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:56:02 crc kubenswrapper[4696]: E0318 15:56:02.048993 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift podName:acfc5351-8c75-4362-8e66-b9ade04d74eb nodeName:}" failed. No retries permitted until 2026-03-18 15:56:06.048969726 +0000 UTC m=+1209.055143932 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift") pod "swift-storage-0" (UID: "acfc5351-8c75-4362-8e66-b9ade04d74eb") : configmap "swift-ring-files" not found Mar 18 15:56:03 crc kubenswrapper[4696]: I0318 15:56:03.556596 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 15:56:03 crc kubenswrapper[4696]: I0318 15:56:03.557107 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 15:56:03 crc kubenswrapper[4696]: I0318 15:56:03.651890 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 15:56:03 crc kubenswrapper[4696]: I0318 15:56:03.688730 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.4766193149999998 podStartE2EDuration="5.688703063s" podCreationTimestamp="2026-03-18 15:55:58 +0000 UTC" firstStartedPulling="2026-03-18 15:55:59.008736177 +0000 UTC m=+1202.014910383" lastFinishedPulling="2026-03-18 15:56:01.220819925 +0000 UTC m=+1204.226994131" observedRunningTime="2026-03-18 15:56:01.981329731 +0000 UTC m=+1204.987503947" watchObservedRunningTime="2026-03-18 15:56:03.688703063 +0000 UTC m=+1206.694877269" Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.103076 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.812599 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.813164 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.936081 4696 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/glance-161d-account-create-update-cw9fd"] Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.937776 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.940347 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.946699 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.946854 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-f8nj6"] Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.948328 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.962118 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-161d-account-create-update-cw9fd"] Mar 18 15:56:04 crc kubenswrapper[4696]: I0318 15:56:04.982068 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-f8nj6"] Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.011595 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" event={"ID":"bc7a7cdc-3686-4af9-89ab-fea81132767c","Type":"ContainerStarted","Data":"ba3ab3f011f8667d0271bb0f763da555ddb50262d2ff5e11786bcf5016594a64"} Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.013286 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4btbj\" (UniqueName: \"kubernetes.io/projected/2662d61f-9289-4c66-8823-4bb09d86dd75-kube-api-access-4btbj\") pod \"glance-161d-account-create-update-cw9fd\" (UID: \"2662d61f-9289-4c66-8823-4bb09d86dd75\") " 
pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.013374 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-operator-scripts\") pod \"glance-db-create-f8nj6\" (UID: \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\") " pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.013409 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2662d61f-9289-4c66-8823-4bb09d86dd75-operator-scripts\") pod \"glance-161d-account-create-update-cw9fd\" (UID: \"2662d61f-9289-4c66-8823-4bb09d86dd75\") " pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.013514 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm94s\" (UniqueName: \"kubernetes.io/projected/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-kube-api-access-tm94s\") pod \"glance-db-create-f8nj6\" (UID: \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\") " pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.017010 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xj2ch" event={"ID":"62b4b14f-0ab3-4906-9c97-8c3092cd5379","Type":"ContainerStarted","Data":"8075b95a7f44c9f907657d8ab536470921854bc1d4768adb9f8f8686c5b4002c"} Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.033982 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" podStartSLOduration=1.8287555580000001 podStartE2EDuration="5.03396225s" podCreationTimestamp="2026-03-18 15:56:00 +0000 UTC" firstStartedPulling="2026-03-18 15:56:01.18313093 +0000 UTC 
m=+1204.189305136" lastFinishedPulling="2026-03-18 15:56:04.388337622 +0000 UTC m=+1207.394511828" observedRunningTime="2026-03-18 15:56:05.029284423 +0000 UTC m=+1208.035458629" watchObservedRunningTime="2026-03-18 15:56:05.03396225 +0000 UTC m=+1208.040136456" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.051024 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xj2ch" podStartSLOduration=2.625343855 podStartE2EDuration="7.051003467s" podCreationTimestamp="2026-03-18 15:55:58 +0000 UTC" firstStartedPulling="2026-03-18 15:55:59.964200479 +0000 UTC m=+1202.970374695" lastFinishedPulling="2026-03-18 15:56:04.389860101 +0000 UTC m=+1207.396034307" observedRunningTime="2026-03-18 15:56:05.047176661 +0000 UTC m=+1208.053350867" watchObservedRunningTime="2026-03-18 15:56:05.051003467 +0000 UTC m=+1208.057177673" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.105905 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.114749 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4btbj\" (UniqueName: \"kubernetes.io/projected/2662d61f-9289-4c66-8823-4bb09d86dd75-kube-api-access-4btbj\") pod \"glance-161d-account-create-update-cw9fd\" (UID: \"2662d61f-9289-4c66-8823-4bb09d86dd75\") " pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.114899 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-operator-scripts\") pod \"glance-db-create-f8nj6\" (UID: \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\") " pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.114944 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2662d61f-9289-4c66-8823-4bb09d86dd75-operator-scripts\") pod \"glance-161d-account-create-update-cw9fd\" (UID: \"2662d61f-9289-4c66-8823-4bb09d86dd75\") " pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.115126 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm94s\" (UniqueName: \"kubernetes.io/projected/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-kube-api-access-tm94s\") pod \"glance-db-create-f8nj6\" (UID: \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\") " pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.116178 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-operator-scripts\") pod \"glance-db-create-f8nj6\" (UID: \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\") " pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.116729 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2662d61f-9289-4c66-8823-4bb09d86dd75-operator-scripts\") pod \"glance-161d-account-create-update-cw9fd\" (UID: \"2662d61f-9289-4c66-8823-4bb09d86dd75\") " pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.145347 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm94s\" (UniqueName: \"kubernetes.io/projected/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-kube-api-access-tm94s\") pod \"glance-db-create-f8nj6\" (UID: \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\") " pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.147567 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4btbj\" (UniqueName: \"kubernetes.io/projected/2662d61f-9289-4c66-8823-4bb09d86dd75-kube-api-access-4btbj\") pod \"glance-161d-account-create-update-cw9fd\" (UID: \"2662d61f-9289-4c66-8823-4bb09d86dd75\") " pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.258158 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.274881 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.698572 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jgf82"] Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.700328 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.711409 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jgf82"] Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.771785 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-161d-account-create-update-cw9fd"] Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.805331 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d8de-account-create-update-r4hss"] Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.813093 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.815860 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.818924 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d8de-account-create-update-r4hss"] Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.829846 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c15c3a-cbbf-4ebf-b594-1782495f18db-operator-scripts\") pod \"keystone-db-create-jgf82\" (UID: \"90c15c3a-cbbf-4ebf-b594-1782495f18db\") " pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.829923 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdf9\" (UniqueName: \"kubernetes.io/projected/90c15c3a-cbbf-4ebf-b594-1782495f18db-kube-api-access-9wdf9\") pod \"keystone-db-create-jgf82\" (UID: \"90c15c3a-cbbf-4ebf-b594-1782495f18db\") " pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:05 crc kubenswrapper[4696]: W0318 15:56:05.847788 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d2d55c5_26d3_4b9c_9fb4_8a56baf576b7.slice/crio-752d2dfa3755464d14245d8afa4a6290d95493b4538c2476073c8821f47ad4c8 WatchSource:0}: Error finding container 752d2dfa3755464d14245d8afa4a6290d95493b4538c2476073c8821f47ad4c8: Status 404 returned error can't find the container with id 752d2dfa3755464d14245d8afa4a6290d95493b4538c2476073c8821f47ad4c8 Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.851381 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-f8nj6"] Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 
15:56:05.925826 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-th84q"] Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.927174 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-th84q" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.934508 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdf9\" (UniqueName: \"kubernetes.io/projected/90c15c3a-cbbf-4ebf-b594-1782495f18db-kube-api-access-9wdf9\") pod \"keystone-db-create-jgf82\" (UID: \"90c15c3a-cbbf-4ebf-b594-1782495f18db\") " pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.934706 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m4qh\" (UniqueName: \"kubernetes.io/projected/b0ad9bfa-558d-440b-9297-c145b93193c2-kube-api-access-6m4qh\") pod \"keystone-d8de-account-create-update-r4hss\" (UID: \"b0ad9bfa-558d-440b-9297-c145b93193c2\") " pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.934837 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ad9bfa-558d-440b-9297-c145b93193c2-operator-scripts\") pod \"keystone-d8de-account-create-update-r4hss\" (UID: \"b0ad9bfa-558d-440b-9297-c145b93193c2\") " pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.935313 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c15c3a-cbbf-4ebf-b594-1782495f18db-operator-scripts\") pod \"keystone-db-create-jgf82\" (UID: \"90c15c3a-cbbf-4ebf-b594-1782495f18db\") " pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:05 crc kubenswrapper[4696]: 
I0318 15:56:05.936351 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c15c3a-cbbf-4ebf-b594-1782495f18db-operator-scripts\") pod \"keystone-db-create-jgf82\" (UID: \"90c15c3a-cbbf-4ebf-b594-1782495f18db\") " pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.964669 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdf9\" (UniqueName: \"kubernetes.io/projected/90c15c3a-cbbf-4ebf-b594-1782495f18db-kube-api-access-9wdf9\") pod \"keystone-db-create-jgf82\" (UID: \"90c15c3a-cbbf-4ebf-b594-1782495f18db\") " pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:05 crc kubenswrapper[4696]: I0318 15:56:05.989108 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-th84q"] Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.024495 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.038478 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-161d-account-create-update-cw9fd" event={"ID":"2662d61f-9289-4c66-8823-4bb09d86dd75","Type":"ContainerStarted","Data":"cbd7576e5c32160e36fbd41b68ba1b6d3b8aaa3f4f5ad13e3aa1fe8a0156e486"} Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.039782 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsj4\" (UniqueName: \"kubernetes.io/projected/8a302450-56dc-4388-8796-f657954f0e25-kube-api-access-xfsj4\") pod \"placement-db-create-th84q\" (UID: \"8a302450-56dc-4388-8796-f657954f0e25\") " pod="openstack/placement-db-create-th84q" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.039913 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/8a302450-56dc-4388-8796-f657954f0e25-operator-scripts\") pod \"placement-db-create-th84q\" (UID: \"8a302450-56dc-4388-8796-f657954f0e25\") " pod="openstack/placement-db-create-th84q" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.040110 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m4qh\" (UniqueName: \"kubernetes.io/projected/b0ad9bfa-558d-440b-9297-c145b93193c2-kube-api-access-6m4qh\") pod \"keystone-d8de-account-create-update-r4hss\" (UID: \"b0ad9bfa-558d-440b-9297-c145b93193c2\") " pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.040203 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ad9bfa-558d-440b-9297-c145b93193c2-operator-scripts\") pod \"keystone-d8de-account-create-update-r4hss\" (UID: \"b0ad9bfa-558d-440b-9297-c145b93193c2\") " pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.041294 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f8nj6" event={"ID":"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7","Type":"ContainerStarted","Data":"752d2dfa3755464d14245d8afa4a6290d95493b4538c2476073c8821f47ad4c8"} Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.041460 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ad9bfa-558d-440b-9297-c145b93193c2-operator-scripts\") pod \"keystone-d8de-account-create-update-r4hss\" (UID: \"b0ad9bfa-558d-440b-9297-c145b93193c2\") " pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.043361 4696 generic.go:334] "Generic (PLEG): container finished" podID="bc7a7cdc-3686-4af9-89ab-fea81132767c" 
containerID="ba3ab3f011f8667d0271bb0f763da555ddb50262d2ff5e11786bcf5016594a64" exitCode=0 Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.043497 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" event={"ID":"bc7a7cdc-3686-4af9-89ab-fea81132767c","Type":"ContainerDied","Data":"ba3ab3f011f8667d0271bb0f763da555ddb50262d2ff5e11786bcf5016594a64"} Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.065512 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m4qh\" (UniqueName: \"kubernetes.io/projected/b0ad9bfa-558d-440b-9297-c145b93193c2-kube-api-access-6m4qh\") pod \"keystone-d8de-account-create-update-r4hss\" (UID: \"b0ad9bfa-558d-440b-9297-c145b93193c2\") " pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.114999 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-effc-account-create-update-phf9v"] Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.116806 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.119222 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.128019 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-effc-account-create-update-phf9v"] Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.141735 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsj4\" (UniqueName: \"kubernetes.io/projected/8a302450-56dc-4388-8796-f657954f0e25-kube-api-access-xfsj4\") pod \"placement-db-create-th84q\" (UID: \"8a302450-56dc-4388-8796-f657954f0e25\") " pod="openstack/placement-db-create-th84q" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.141773 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a302450-56dc-4388-8796-f657954f0e25-operator-scripts\") pod \"placement-db-create-th84q\" (UID: \"8a302450-56dc-4388-8796-f657954f0e25\") " pod="openstack/placement-db-create-th84q" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.141882 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.144249 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a302450-56dc-4388-8796-f657954f0e25-operator-scripts\") pod \"placement-db-create-th84q\" (UID: \"8a302450-56dc-4388-8796-f657954f0e25\") " pod="openstack/placement-db-create-th84q" Mar 18 15:56:06 crc kubenswrapper[4696]: E0318 
15:56:06.145389 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:56:06 crc kubenswrapper[4696]: E0318 15:56:06.145421 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:56:06 crc kubenswrapper[4696]: E0318 15:56:06.145471 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift podName:acfc5351-8c75-4362-8e66-b9ade04d74eb nodeName:}" failed. No retries permitted until 2026-03-18 15:56:14.145452281 +0000 UTC m=+1217.151626487 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift") pod "swift-storage-0" (UID: "acfc5351-8c75-4362-8e66-b9ade04d74eb") : configmap "swift-ring-files" not found Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.168394 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsj4\" (UniqueName: \"kubernetes.io/projected/8a302450-56dc-4388-8796-f657954f0e25-kube-api-access-xfsj4\") pod \"placement-db-create-th84q\" (UID: \"8a302450-56dc-4388-8796-f657954f0e25\") " pod="openstack/placement-db-create-th84q" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.243841 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sjvx\" (UniqueName: \"kubernetes.io/projected/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-kube-api-access-7sjvx\") pod \"placement-effc-account-create-update-phf9v\" (UID: \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\") " pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.243935 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-operator-scripts\") pod \"placement-effc-account-create-update-phf9v\" (UID: \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\") " pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.287738 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.294054 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-th84q" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.346968 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sjvx\" (UniqueName: \"kubernetes.io/projected/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-kube-api-access-7sjvx\") pod \"placement-effc-account-create-update-phf9v\" (UID: \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\") " pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.347028 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-operator-scripts\") pod \"placement-effc-account-create-update-phf9v\" (UID: \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\") " pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.347832 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-operator-scripts\") pod \"placement-effc-account-create-update-phf9v\" (UID: \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\") " pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.365529 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7sjvx\" (UniqueName: \"kubernetes.io/projected/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-kube-api-access-7sjvx\") pod \"placement-effc-account-create-update-phf9v\" (UID: \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\") " pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.440956 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.570803 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jgf82"] Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.873268 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-th84q"] Mar 18 15:56:06 crc kubenswrapper[4696]: W0318 15:56:06.876650 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a302450_56dc_4388_8796_f657954f0e25.slice/crio-1b45c19660cbc92a2487eae831b564b505c260d741ae18a1c2af3c519e7f1632 WatchSource:0}: Error finding container 1b45c19660cbc92a2487eae831b564b505c260d741ae18a1c2af3c519e7f1632: Status 404 returned error can't find the container with id 1b45c19660cbc92a2487eae831b564b505c260d741ae18a1c2af3c519e7f1632 Mar 18 15:56:06 crc kubenswrapper[4696]: I0318 15:56:06.986666 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d8de-account-create-update-r4hss"] Mar 18 15:56:06 crc kubenswrapper[4696]: W0318 15:56:06.991404 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0ad9bfa_558d_440b_9297_c145b93193c2.slice/crio-bf87a6398c5ddeece4762b8079703389237802059cda6b3fc1be1335588fa463 WatchSource:0}: Error finding container bf87a6398c5ddeece4762b8079703389237802059cda6b3fc1be1335588fa463: Status 404 
returned error can't find the container with id bf87a6398c5ddeece4762b8079703389237802059cda6b3fc1be1335588fa463 Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.058782 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-effc-account-create-update-phf9v"] Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.064571 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d8de-account-create-update-r4hss" event={"ID":"b0ad9bfa-558d-440b-9297-c145b93193c2","Type":"ContainerStarted","Data":"bf87a6398c5ddeece4762b8079703389237802059cda6b3fc1be1335588fa463"} Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.070136 4696 generic.go:334] "Generic (PLEG): container finished" podID="2662d61f-9289-4c66-8823-4bb09d86dd75" containerID="3774ac525944a9276f9bf8db749c93c0b38edc378c7ed2aeaa398d6922445351" exitCode=0 Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.070227 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-161d-account-create-update-cw9fd" event={"ID":"2662d61f-9289-4c66-8823-4bb09d86dd75","Type":"ContainerDied","Data":"3774ac525944a9276f9bf8db749c93c0b38edc378c7ed2aeaa398d6922445351"} Mar 18 15:56:07 crc kubenswrapper[4696]: W0318 15:56:07.072998 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6d2b900_71d4_4dfd_bfda_07d44f39ee48.slice/crio-6ff4701ba1206b637da34358d405df84e33e2356ad21b93e2b4052854ca23586 WatchSource:0}: Error finding container 6ff4701ba1206b637da34358d405df84e33e2356ad21b93e2b4052854ca23586: Status 404 returned error can't find the container with id 6ff4701ba1206b637da34358d405df84e33e2356ad21b93e2b4052854ca23586 Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.076783 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-th84q" 
event={"ID":"8a302450-56dc-4388-8796-f657954f0e25","Type":"ContainerStarted","Data":"a5995040a71b2e41a26358367386a0bcac58f1a210932ca456db70d802245f37"} Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.076827 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-th84q" event={"ID":"8a302450-56dc-4388-8796-f657954f0e25","Type":"ContainerStarted","Data":"1b45c19660cbc92a2487eae831b564b505c260d741ae18a1c2af3c519e7f1632"} Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.081101 4696 generic.go:334] "Generic (PLEG): container finished" podID="4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7" containerID="007251e633b476c0d4d6e5a3beeff4f57dbfe03715b5d0c6ee78022b6c342c9b" exitCode=0 Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.081252 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f8nj6" event={"ID":"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7","Type":"ContainerDied","Data":"007251e633b476c0d4d6e5a3beeff4f57dbfe03715b5d0c6ee78022b6c342c9b"} Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.084621 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jgf82" event={"ID":"90c15c3a-cbbf-4ebf-b594-1782495f18db","Type":"ContainerStarted","Data":"3dadb71be7e3477427169d4d6cf08f73bc5befa344475f97d547fa61533071d3"} Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.084815 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jgf82" event={"ID":"90c15c3a-cbbf-4ebf-b594-1782495f18db","Type":"ContainerStarted","Data":"d4d58efb65cb8998709a88b6c0cdf2d27fbd2abc8633e21c40ed35a158cd783b"} Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.134845 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-th84q" podStartSLOduration=2.134825641 podStartE2EDuration="2.134825641s" podCreationTimestamp="2026-03-18 15:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:07.133718344 +0000 UTC m=+1210.139892540" watchObservedRunningTime="2026-03-18 15:56:07.134825641 +0000 UTC m=+1210.140999867" Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.164437 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-jgf82" podStartSLOduration=2.164420843 podStartE2EDuration="2.164420843s" podCreationTimestamp="2026-03-18 15:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:07.164166347 +0000 UTC m=+1210.170340553" watchObservedRunningTime="2026-03-18 15:56:07.164420843 +0000 UTC m=+1210.170595049" Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.383147 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.450006 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.484008 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfklp\" (UniqueName: \"kubernetes.io/projected/bc7a7cdc-3686-4af9-89ab-fea81132767c-kube-api-access-kfklp\") pod \"bc7a7cdc-3686-4af9-89ab-fea81132767c\" (UID: \"bc7a7cdc-3686-4af9-89ab-fea81132767c\") " Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.492145 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc7a7cdc-3686-4af9-89ab-fea81132767c-kube-api-access-kfklp" (OuterVolumeSpecName: "kube-api-access-kfklp") pod "bc7a7cdc-3686-4af9-89ab-fea81132767c" (UID: "bc7a7cdc-3686-4af9-89ab-fea81132767c"). InnerVolumeSpecName "kube-api-access-kfklp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.518309 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-lfms6"] Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.518721 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-lfms6" podUID="5971f2ec-035d-482c-8d75-eb8af348a864" containerName="dnsmasq-dns" containerID="cri-o://82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4" gracePeriod=10 Mar 18 15:56:07 crc kubenswrapper[4696]: I0318 15:56:07.586588 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfklp\" (UniqueName: \"kubernetes.io/projected/bc7a7cdc-3686-4af9-89ab-fea81132767c-kube-api-access-kfklp\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.100744 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-lfms6" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.101586 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" event={"ID":"bc7a7cdc-3686-4af9-89ab-fea81132767c","Type":"ContainerDied","Data":"3c26c09e13a86705739ce56c820d35bdaecd46d91996cf9b9eb6a05f04ebdf79"} Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.101633 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c26c09e13a86705739ce56c820d35bdaecd46d91996cf9b9eb6a05f04ebdf79" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.101692 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-8bzbz" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.112069 4696 generic.go:334] "Generic (PLEG): container finished" podID="a6d2b900-71d4-4dfd-bfda-07d44f39ee48" containerID="d19778f8475cb901076d1c303b9beb3e0d8935535e7ba653d9a58ea6b63a2908" exitCode=0 Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.112458 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-lks4l"] Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.112494 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-effc-account-create-update-phf9v" event={"ID":"a6d2b900-71d4-4dfd-bfda-07d44f39ee48","Type":"ContainerDied","Data":"d19778f8475cb901076d1c303b9beb3e0d8935535e7ba653d9a58ea6b63a2908"} Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.112934 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-effc-account-create-update-phf9v" event={"ID":"a6d2b900-71d4-4dfd-bfda-07d44f39ee48","Type":"ContainerStarted","Data":"6ff4701ba1206b637da34358d405df84e33e2356ad21b93e2b4052854ca23586"} Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.120919 4696 generic.go:334] "Generic (PLEG): container finished" podID="5971f2ec-035d-482c-8d75-eb8af348a864" containerID="82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4" exitCode=0 Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.121059 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-lfms6" event={"ID":"5971f2ec-035d-482c-8d75-eb8af348a864","Type":"ContainerDied","Data":"82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4"} Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.121099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-lfms6" 
event={"ID":"5971f2ec-035d-482c-8d75-eb8af348a864","Type":"ContainerDied","Data":"aa46b00b66724a8d86cbd42f1b2f96a0e41d24c0e3f1d095e90772f23dcb6e4a"} Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.121120 4696 scope.go:117] "RemoveContainer" containerID="82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.121421 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-lfms6" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.121410 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-lks4l"] Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.135347 4696 generic.go:334] "Generic (PLEG): container finished" podID="b0ad9bfa-558d-440b-9297-c145b93193c2" containerID="0d6c20723944e6fcaf0a2c7c5c4d724f650152b99555979b8c399aac608f67ab" exitCode=0 Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.135435 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d8de-account-create-update-r4hss" event={"ID":"b0ad9bfa-558d-440b-9297-c145b93193c2","Type":"ContainerDied","Data":"0d6c20723944e6fcaf0a2c7c5c4d724f650152b99555979b8c399aac608f67ab"} Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.147164 4696 generic.go:334] "Generic (PLEG): container finished" podID="8a302450-56dc-4388-8796-f657954f0e25" containerID="a5995040a71b2e41a26358367386a0bcac58f1a210932ca456db70d802245f37" exitCode=0 Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.147294 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-th84q" event={"ID":"8a302450-56dc-4388-8796-f657954f0e25","Type":"ContainerDied","Data":"a5995040a71b2e41a26358367386a0bcac58f1a210932ca456db70d802245f37"} Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.163461 4696 scope.go:117] "RemoveContainer" 
containerID="5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.163954 4696 generic.go:334] "Generic (PLEG): container finished" podID="90c15c3a-cbbf-4ebf-b594-1782495f18db" containerID="3dadb71be7e3477427169d4d6cf08f73bc5befa344475f97d547fa61533071d3" exitCode=0 Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.164486 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jgf82" event={"ID":"90c15c3a-cbbf-4ebf-b594-1782495f18db","Type":"ContainerDied","Data":"3dadb71be7e3477427169d4d6cf08f73bc5befa344475f97d547fa61533071d3"} Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.205918 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6xh7\" (UniqueName: \"kubernetes.io/projected/5971f2ec-035d-482c-8d75-eb8af348a864-kube-api-access-b6xh7\") pod \"5971f2ec-035d-482c-8d75-eb8af348a864\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.205995 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-sb\") pod \"5971f2ec-035d-482c-8d75-eb8af348a864\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.206036 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-dns-svc\") pod \"5971f2ec-035d-482c-8d75-eb8af348a864\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.206261 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-config\") pod \"5971f2ec-035d-482c-8d75-eb8af348a864\" (UID: 
\"5971f2ec-035d-482c-8d75-eb8af348a864\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.206290 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-nb\") pod \"5971f2ec-035d-482c-8d75-eb8af348a864\" (UID: \"5971f2ec-035d-482c-8d75-eb8af348a864\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.218976 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5971f2ec-035d-482c-8d75-eb8af348a864-kube-api-access-b6xh7" (OuterVolumeSpecName: "kube-api-access-b6xh7") pod "5971f2ec-035d-482c-8d75-eb8af348a864" (UID: "5971f2ec-035d-482c-8d75-eb8af348a864"). InnerVolumeSpecName "kube-api-access-b6xh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.247720 4696 scope.go:117] "RemoveContainer" containerID="82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4" Mar 18 15:56:08 crc kubenswrapper[4696]: E0318 15:56:08.248406 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4\": container with ID starting with 82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4 not found: ID does not exist" containerID="82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.248453 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4"} err="failed to get container status \"82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4\": rpc error: code = NotFound desc = could not find container \"82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4\": container with ID starting 
with 82b976fe11924f48a0159d2fada52277301e8e608e0d79a2bd68980aa9e8fbb4 not found: ID does not exist" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.248480 4696 scope.go:117] "RemoveContainer" containerID="5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711" Mar 18 15:56:08 crc kubenswrapper[4696]: E0318 15:56:08.249835 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711\": container with ID starting with 5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711 not found: ID does not exist" containerID="5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.249867 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711"} err="failed to get container status \"5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711\": rpc error: code = NotFound desc = could not find container \"5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711\": container with ID starting with 5c2930df67b3e34ec223b0a8a471f19e85aaf85fcb7cf5935b3f1230a847e711 not found: ID does not exist" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.277891 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5971f2ec-035d-482c-8d75-eb8af348a864" (UID: "5971f2ec-035d-482c-8d75-eb8af348a864"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.290028 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5971f2ec-035d-482c-8d75-eb8af348a864" (UID: "5971f2ec-035d-482c-8d75-eb8af348a864"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.294340 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5971f2ec-035d-482c-8d75-eb8af348a864" (UID: "5971f2ec-035d-482c-8d75-eb8af348a864"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.309101 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6xh7\" (UniqueName: \"kubernetes.io/projected/5971f2ec-035d-482c-8d75-eb8af348a864-kube-api-access-b6xh7\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.309125 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.309134 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.309142 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 
15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.315437 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-config" (OuterVolumeSpecName: "config") pod "5971f2ec-035d-482c-8d75-eb8af348a864" (UID: "5971f2ec-035d-482c-8d75-eb8af348a864"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.411017 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5971f2ec-035d-482c-8d75-eb8af348a864-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.491330 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-lfms6"] Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.496701 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-lfms6"] Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.716965 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.761315 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.819268 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm94s\" (UniqueName: \"kubernetes.io/projected/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-kube-api-access-tm94s\") pod \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\" (UID: \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.819430 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-operator-scripts\") pod \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\" (UID: \"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.819537 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4btbj\" (UniqueName: \"kubernetes.io/projected/2662d61f-9289-4c66-8823-4bb09d86dd75-kube-api-access-4btbj\") pod \"2662d61f-9289-4c66-8823-4bb09d86dd75\" (UID: \"2662d61f-9289-4c66-8823-4bb09d86dd75\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.819585 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2662d61f-9289-4c66-8823-4bb09d86dd75-operator-scripts\") pod \"2662d61f-9289-4c66-8823-4bb09d86dd75\" (UID: \"2662d61f-9289-4c66-8823-4bb09d86dd75\") " Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.820111 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7" (UID: "4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.820199 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2662d61f-9289-4c66-8823-4bb09d86dd75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2662d61f-9289-4c66-8823-4bb09d86dd75" (UID: "2662d61f-9289-4c66-8823-4bb09d86dd75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.828731 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2662d61f-9289-4c66-8823-4bb09d86dd75-kube-api-access-4btbj" (OuterVolumeSpecName: "kube-api-access-4btbj") pod "2662d61f-9289-4c66-8823-4bb09d86dd75" (UID: "2662d61f-9289-4c66-8823-4bb09d86dd75"). InnerVolumeSpecName "kube-api-access-4btbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.828843 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-kube-api-access-tm94s" (OuterVolumeSpecName: "kube-api-access-tm94s") pod "4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7" (UID: "4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7"). InnerVolumeSpecName "kube-api-access-tm94s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.922148 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm94s\" (UniqueName: \"kubernetes.io/projected/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-kube-api-access-tm94s\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.922460 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.922558 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4btbj\" (UniqueName: \"kubernetes.io/projected/2662d61f-9289-4c66-8823-4bb09d86dd75-kube-api-access-4btbj\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4696]: I0318 15:56:08.922636 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2662d61f-9289-4c66-8823-4bb09d86dd75-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.173772 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-161d-account-create-update-cw9fd" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.173768 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-161d-account-create-update-cw9fd" event={"ID":"2662d61f-9289-4c66-8823-4bb09d86dd75","Type":"ContainerDied","Data":"cbd7576e5c32160e36fbd41b68ba1b6d3b8aaa3f4f5ad13e3aa1fe8a0156e486"} Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.174293 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbd7576e5c32160e36fbd41b68ba1b6d3b8aaa3f4f5ad13e3aa1fe8a0156e486" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.175163 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f8nj6" event={"ID":"4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7","Type":"ContainerDied","Data":"752d2dfa3755464d14245d8afa4a6290d95493b4538c2476073c8821f47ad4c8"} Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.175195 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="752d2dfa3755464d14245d8afa4a6290d95493b4538c2476073c8821f47ad4c8" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.175276 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f8nj6" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.587234 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.620142 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b6ad1a0-39d2-4efa-b976-6879406394d3" path="/var/lib/kubelet/pods/4b6ad1a0-39d2-4efa-b976-6879406394d3/volumes" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.621062 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5971f2ec-035d-482c-8d75-eb8af348a864" path="/var/lib/kubelet/pods/5971f2ec-035d-482c-8d75-eb8af348a864/volumes" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.645003 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sjvx\" (UniqueName: \"kubernetes.io/projected/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-kube-api-access-7sjvx\") pod \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\" (UID: \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\") " Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.645172 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-operator-scripts\") pod \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\" (UID: \"a6d2b900-71d4-4dfd-bfda-07d44f39ee48\") " Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.653208 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6d2b900-71d4-4dfd-bfda-07d44f39ee48" (UID: "a6d2b900-71d4-4dfd-bfda-07d44f39ee48"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.667942 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-kube-api-access-7sjvx" (OuterVolumeSpecName: "kube-api-access-7sjvx") pod "a6d2b900-71d4-4dfd-bfda-07d44f39ee48" (UID: "a6d2b900-71d4-4dfd-bfda-07d44f39ee48"). InnerVolumeSpecName "kube-api-access-7sjvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.670638 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sjvx\" (UniqueName: \"kubernetes.io/projected/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-kube-api-access-7sjvx\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.670679 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d2b900-71d4-4dfd-bfda-07d44f39ee48-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.778862 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-th84q" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.787533 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.790835 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.874460 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ad9bfa-558d-440b-9297-c145b93193c2-operator-scripts\") pod \"b0ad9bfa-558d-440b-9297-c145b93193c2\" (UID: \"b0ad9bfa-558d-440b-9297-c145b93193c2\") " Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.874599 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c15c3a-cbbf-4ebf-b594-1782495f18db-operator-scripts\") pod \"90c15c3a-cbbf-4ebf-b594-1782495f18db\" (UID: \"90c15c3a-cbbf-4ebf-b594-1782495f18db\") " Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.874629 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wdf9\" (UniqueName: \"kubernetes.io/projected/90c15c3a-cbbf-4ebf-b594-1782495f18db-kube-api-access-9wdf9\") pod \"90c15c3a-cbbf-4ebf-b594-1782495f18db\" (UID: \"90c15c3a-cbbf-4ebf-b594-1782495f18db\") " Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.874724 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsj4\" (UniqueName: \"kubernetes.io/projected/8a302450-56dc-4388-8796-f657954f0e25-kube-api-access-xfsj4\") pod \"8a302450-56dc-4388-8796-f657954f0e25\" (UID: \"8a302450-56dc-4388-8796-f657954f0e25\") " Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.874803 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m4qh\" (UniqueName: \"kubernetes.io/projected/b0ad9bfa-558d-440b-9297-c145b93193c2-kube-api-access-6m4qh\") pod \"b0ad9bfa-558d-440b-9297-c145b93193c2\" (UID: \"b0ad9bfa-558d-440b-9297-c145b93193c2\") " Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.874867 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a302450-56dc-4388-8796-f657954f0e25-operator-scripts\") pod \"8a302450-56dc-4388-8796-f657954f0e25\" (UID: \"8a302450-56dc-4388-8796-f657954f0e25\") " Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.877854 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a302450-56dc-4388-8796-f657954f0e25-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a302450-56dc-4388-8796-f657954f0e25" (UID: "8a302450-56dc-4388-8796-f657954f0e25"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.878273 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ad9bfa-558d-440b-9297-c145b93193c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0ad9bfa-558d-440b-9297-c145b93193c2" (UID: "b0ad9bfa-558d-440b-9297-c145b93193c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.878506 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c15c3a-cbbf-4ebf-b594-1782495f18db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90c15c3a-cbbf-4ebf-b594-1782495f18db" (UID: "90c15c3a-cbbf-4ebf-b594-1782495f18db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.882065 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c15c3a-cbbf-4ebf-b594-1782495f18db-kube-api-access-9wdf9" (OuterVolumeSpecName: "kube-api-access-9wdf9") pod "90c15c3a-cbbf-4ebf-b594-1782495f18db" (UID: "90c15c3a-cbbf-4ebf-b594-1782495f18db"). 
InnerVolumeSpecName "kube-api-access-9wdf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.882551 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a302450-56dc-4388-8796-f657954f0e25-kube-api-access-xfsj4" (OuterVolumeSpecName: "kube-api-access-xfsj4") pod "8a302450-56dc-4388-8796-f657954f0e25" (UID: "8a302450-56dc-4388-8796-f657954f0e25"). InnerVolumeSpecName "kube-api-access-xfsj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.894918 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ad9bfa-558d-440b-9297-c145b93193c2-kube-api-access-6m4qh" (OuterVolumeSpecName: "kube-api-access-6m4qh") pod "b0ad9bfa-558d-440b-9297-c145b93193c2" (UID: "b0ad9bfa-558d-440b-9297-c145b93193c2"). InnerVolumeSpecName "kube-api-access-6m4qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.977033 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsj4\" (UniqueName: \"kubernetes.io/projected/8a302450-56dc-4388-8796-f657954f0e25-kube-api-access-xfsj4\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.977072 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m4qh\" (UniqueName: \"kubernetes.io/projected/b0ad9bfa-558d-440b-9297-c145b93193c2-kube-api-access-6m4qh\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.977082 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a302450-56dc-4388-8796-f657954f0e25-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.977091 4696 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0ad9bfa-558d-440b-9297-c145b93193c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.977101 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c15c3a-cbbf-4ebf-b594-1782495f18db-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:09 crc kubenswrapper[4696]: I0318 15:56:09.977109 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wdf9\" (UniqueName: \"kubernetes.io/projected/90c15c3a-cbbf-4ebf-b594-1782495f18db-kube-api-access-9wdf9\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167173 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-sfb9t"] Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167748 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ad9bfa-558d-440b-9297-c145b93193c2" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167779 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ad9bfa-558d-440b-9297-c145b93193c2" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167796 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc7a7cdc-3686-4af9-89ab-fea81132767c" containerName="oc" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167805 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7a7cdc-3686-4af9-89ab-fea81132767c" containerName="oc" Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167838 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a302450-56dc-4388-8796-f657954f0e25" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167847 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a302450-56dc-4388-8796-f657954f0e25" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167861 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5971f2ec-035d-482c-8d75-eb8af348a864" containerName="init" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167871 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5971f2ec-035d-482c-8d75-eb8af348a864" containerName="init" Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167889 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c15c3a-cbbf-4ebf-b594-1782495f18db" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167896 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c15c3a-cbbf-4ebf-b594-1782495f18db" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167912 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2662d61f-9289-4c66-8823-4bb09d86dd75" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167919 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2662d61f-9289-4c66-8823-4bb09d86dd75" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167932 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d2b900-71d4-4dfd-bfda-07d44f39ee48" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167939 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d2b900-71d4-4dfd-bfda-07d44f39ee48" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167957 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167967 
4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: E0318 15:56:10.167980 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5971f2ec-035d-482c-8d75-eb8af348a864" containerName="dnsmasq-dns" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.167994 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5971f2ec-035d-482c-8d75-eb8af348a864" containerName="dnsmasq-dns" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.168211 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc7a7cdc-3686-4af9-89ab-fea81132767c" containerName="oc" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.168232 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a302450-56dc-4388-8796-f657954f0e25" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.168240 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2662d61f-9289-4c66-8823-4bb09d86dd75" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.168253 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c15c3a-cbbf-4ebf-b594-1782495f18db" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.168262 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ad9bfa-558d-440b-9297-c145b93193c2" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.168271 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d2b900-71d4-4dfd-bfda-07d44f39ee48" containerName="mariadb-account-create-update" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.168282 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5971f2ec-035d-482c-8d75-eb8af348a864" containerName="dnsmasq-dns" Mar 18 
15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.168299 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7" containerName="mariadb-database-create" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.169203 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.172054 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.172376 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6rg96" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.187299 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sfb9t"] Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.188382 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jgf82" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.188366 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jgf82" event={"ID":"90c15c3a-cbbf-4ebf-b594-1782495f18db","Type":"ContainerDied","Data":"d4d58efb65cb8998709a88b6c0cdf2d27fbd2abc8633e21c40ed35a158cd783b"} Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.188571 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4d58efb65cb8998709a88b6c0cdf2d27fbd2abc8633e21c40ed35a158cd783b" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.191069 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d8de-account-create-update-r4hss" event={"ID":"b0ad9bfa-558d-440b-9297-c145b93193c2","Type":"ContainerDied","Data":"bf87a6398c5ddeece4762b8079703389237802059cda6b3fc1be1335588fa463"} Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.191129 4696 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf87a6398c5ddeece4762b8079703389237802059cda6b3fc1be1335588fa463" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.191226 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d8de-account-create-update-r4hss" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.197354 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-th84q" event={"ID":"8a302450-56dc-4388-8796-f657954f0e25","Type":"ContainerDied","Data":"1b45c19660cbc92a2487eae831b564b505c260d741ae18a1c2af3c519e7f1632"} Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.197412 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b45c19660cbc92a2487eae831b564b505c260d741ae18a1c2af3c519e7f1632" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.197498 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-th84q" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.203411 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-effc-account-create-update-phf9v" event={"ID":"a6d2b900-71d4-4dfd-bfda-07d44f39ee48","Type":"ContainerDied","Data":"6ff4701ba1206b637da34358d405df84e33e2356ad21b93e2b4052854ca23586"} Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.203480 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ff4701ba1206b637da34358d405df84e33e2356ad21b93e2b4052854ca23586" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.203760 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-effc-account-create-update-phf9v" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.283203 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-db-sync-config-data\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.283311 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sp7m\" (UniqueName: \"kubernetes.io/projected/6cfc5851-8295-4c4d-8cb4-4c18f9827227-kube-api-access-8sp7m\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.283504 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-combined-ca-bundle\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.283561 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-config-data\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.384767 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sp7m\" (UniqueName: \"kubernetes.io/projected/6cfc5851-8295-4c4d-8cb4-4c18f9827227-kube-api-access-8sp7m\") pod \"glance-db-sync-sfb9t\" (UID: 
\"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.384951 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-config-data\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.384978 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-combined-ca-bundle\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.385056 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-db-sync-config-data\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.388995 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-combined-ca-bundle\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.389770 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-config-data\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.390087 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-db-sync-config-data\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.408519 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sp7m\" (UniqueName: \"kubernetes.io/projected/6cfc5851-8295-4c4d-8cb4-4c18f9827227-kube-api-access-8sp7m\") pod \"glance-db-sync-sfb9t\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:10 crc kubenswrapper[4696]: I0318 15:56:10.487156 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sfb9t" Mar 18 15:56:11 crc kubenswrapper[4696]: I0318 15:56:11.098198 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sfb9t"] Mar 18 15:56:11 crc kubenswrapper[4696]: I0318 15:56:11.212107 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sfb9t" event={"ID":"6cfc5851-8295-4c4d-8cb4-4c18f9827227","Type":"ContainerStarted","Data":"a1e76f83baf6200059f7cdf8d141771e8dd3f3758566342c6298d8d5b7f2d3ce"} Mar 18 15:56:11 crc kubenswrapper[4696]: I0318 15:56:11.890514 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-stdjz"] Mar 18 15:56:11 crc kubenswrapper[4696]: I0318 15:56:11.891657 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:11 crc kubenswrapper[4696]: I0318 15:56:11.895268 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 15:56:11 crc kubenswrapper[4696]: I0318 15:56:11.902097 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-stdjz"] Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.015555 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f810c54-e552-4322-b668-639f58cab1d1-operator-scripts\") pod \"root-account-create-update-stdjz\" (UID: \"8f810c54-e552-4322-b668-639f58cab1d1\") " pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.015629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxdsk\" (UniqueName: \"kubernetes.io/projected/8f810c54-e552-4322-b668-639f58cab1d1-kube-api-access-fxdsk\") pod \"root-account-create-update-stdjz\" (UID: \"8f810c54-e552-4322-b668-639f58cab1d1\") " pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.117731 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f810c54-e552-4322-b668-639f58cab1d1-operator-scripts\") pod \"root-account-create-update-stdjz\" (UID: \"8f810c54-e552-4322-b668-639f58cab1d1\") " pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.117808 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxdsk\" (UniqueName: \"kubernetes.io/projected/8f810c54-e552-4322-b668-639f58cab1d1-kube-api-access-fxdsk\") pod \"root-account-create-update-stdjz\" (UID: 
\"8f810c54-e552-4322-b668-639f58cab1d1\") " pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.118931 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f810c54-e552-4322-b668-639f58cab1d1-operator-scripts\") pod \"root-account-create-update-stdjz\" (UID: \"8f810c54-e552-4322-b668-639f58cab1d1\") " pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.139752 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxdsk\" (UniqueName: \"kubernetes.io/projected/8f810c54-e552-4322-b668-639f58cab1d1-kube-api-access-fxdsk\") pod \"root-account-create-update-stdjz\" (UID: \"8f810c54-e552-4322-b668-639f58cab1d1\") " pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.184958 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.185053 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.185124 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.186115 4696 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"859c999f0d34d60bd36ffb5138cc29b3e68983a12c55849818cd9411add0b7fd"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.186191 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://859c999f0d34d60bd36ffb5138cc29b3e68983a12c55849818cd9411add0b7fd" gracePeriod=600 Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.213800 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:12 crc kubenswrapper[4696]: I0318 15:56:12.572210 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-stdjz"] Mar 18 15:56:13 crc kubenswrapper[4696]: I0318 15:56:13.235334 4696 generic.go:334] "Generic (PLEG): container finished" podID="8f810c54-e552-4322-b668-639f58cab1d1" containerID="1baf2ee679ae0ca62c028acde1076539a1c73dbbd112ff315de974a1afefd1cd" exitCode=0 Mar 18 15:56:13 crc kubenswrapper[4696]: I0318 15:56:13.235405 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-stdjz" event={"ID":"8f810c54-e552-4322-b668-639f58cab1d1","Type":"ContainerDied","Data":"1baf2ee679ae0ca62c028acde1076539a1c73dbbd112ff315de974a1afefd1cd"} Mar 18 15:56:13 crc kubenswrapper[4696]: I0318 15:56:13.235744 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-stdjz" event={"ID":"8f810c54-e552-4322-b668-639f58cab1d1","Type":"ContainerStarted","Data":"8c69fa472955dfdbcd91ac746f55361e3c92dcbb33e094c68e7f3cc3cde81dbe"} Mar 18 15:56:13 crc kubenswrapper[4696]: I0318 
15:56:13.240616 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="859c999f0d34d60bd36ffb5138cc29b3e68983a12c55849818cd9411add0b7fd" exitCode=0 Mar 18 15:56:13 crc kubenswrapper[4696]: I0318 15:56:13.240658 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"859c999f0d34d60bd36ffb5138cc29b3e68983a12c55849818cd9411add0b7fd"} Mar 18 15:56:13 crc kubenswrapper[4696]: I0318 15:56:13.240682 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"e7d6aeeccf3f0ce1fb7410ba06c6597ef8535cab4338e38adaf8fc42a5797086"} Mar 18 15:56:13 crc kubenswrapper[4696]: I0318 15:56:13.240705 4696 scope.go:117] "RemoveContainer" containerID="8dbceebb5bec41c37e5bdb742818ff7d79c094f0f97795f1ea326504fa9fafa5" Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.164612 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.207749 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfc5351-8c75-4362-8e66-b9ade04d74eb-etc-swift\") pod \"swift-storage-0\" (UID: \"acfc5351-8c75-4362-8e66-b9ade04d74eb\") " pod="openstack/swift-storage-0" Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.251345 4696 generic.go:334] "Generic (PLEG): container finished" podID="62b4b14f-0ab3-4906-9c97-8c3092cd5379" 
containerID="8075b95a7f44c9f907657d8ab536470921854bc1d4768adb9f8f8686c5b4002c" exitCode=0 Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.251422 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xj2ch" event={"ID":"62b4b14f-0ab3-4906-9c97-8c3092cd5379","Type":"ContainerDied","Data":"8075b95a7f44c9f907657d8ab536470921854bc1d4768adb9f8f8686c5b4002c"} Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.258086 4696 generic.go:334] "Generic (PLEG): container finished" podID="9b5b880f-8efc-483f-b734-fa854ddd30dc" containerID="c0fd7f944f641aa39a971d00f949ab3c23bfbfb18ecce7eea052a2f01e079a00" exitCode=0 Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.258306 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b5b880f-8efc-483f-b734-fa854ddd30dc","Type":"ContainerDied","Data":"c0fd7f944f641aa39a971d00f949ab3c23bfbfb18ecce7eea052a2f01e079a00"} Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.260154 4696 generic.go:334] "Generic (PLEG): container finished" podID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerID="15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d" exitCode=0 Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.260311 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0bdb4167-8754-4c20-97ea-b014ce2cafdc","Type":"ContainerDied","Data":"15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d"} Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.424939 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.745747 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.777581 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxdsk\" (UniqueName: \"kubernetes.io/projected/8f810c54-e552-4322-b668-639f58cab1d1-kube-api-access-fxdsk\") pod \"8f810c54-e552-4322-b668-639f58cab1d1\" (UID: \"8f810c54-e552-4322-b668-639f58cab1d1\") " Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.777825 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f810c54-e552-4322-b668-639f58cab1d1-operator-scripts\") pod \"8f810c54-e552-4322-b668-639f58cab1d1\" (UID: \"8f810c54-e552-4322-b668-639f58cab1d1\") " Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.778874 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f810c54-e552-4322-b668-639f58cab1d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f810c54-e552-4322-b668-639f58cab1d1" (UID: "8f810c54-e552-4322-b668-639f58cab1d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.779987 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f810c54-e552-4322-b668-639f58cab1d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.786792 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f810c54-e552-4322-b668-639f58cab1d1-kube-api-access-fxdsk" (OuterVolumeSpecName: "kube-api-access-fxdsk") pod "8f810c54-e552-4322-b668-639f58cab1d1" (UID: "8f810c54-e552-4322-b668-639f58cab1d1"). InnerVolumeSpecName "kube-api-access-fxdsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:14 crc kubenswrapper[4696]: I0318 15:56:14.885401 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxdsk\" (UniqueName: \"kubernetes.io/projected/8f810c54-e552-4322-b668-639f58cab1d1-kube-api-access-fxdsk\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.128934 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 15:56:15 crc kubenswrapper[4696]: W0318 15:56:15.142771 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacfc5351_8c75_4362_8e66_b9ade04d74eb.slice/crio-195d0b4d817a28e135e0ae43289e1f6f3228817156b270d4081a437efdcfc1e0 WatchSource:0}: Error finding container 195d0b4d817a28e135e0ae43289e1f6f3228817156b270d4081a437efdcfc1e0: Status 404 returned error can't find the container with id 195d0b4d817a28e135e0ae43289e1f6f3228817156b270d4081a437efdcfc1e0 Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.272304 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b5b880f-8efc-483f-b734-fa854ddd30dc","Type":"ContainerStarted","Data":"763a47e190b3766069fc26ef6310bdc5c5beb355c682d3dd8ee7ed57e35b1c03"} Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.274452 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.276712 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-stdjz" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.276853 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-stdjz" event={"ID":"8f810c54-e552-4322-b668-639f58cab1d1","Type":"ContainerDied","Data":"8c69fa472955dfdbcd91ac746f55361e3c92dcbb33e094c68e7f3cc3cde81dbe"} Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.276952 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c69fa472955dfdbcd91ac746f55361e3c92dcbb33e094c68e7f3cc3cde81dbe" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.288623 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0bdb4167-8754-4c20-97ea-b014ce2cafdc","Type":"ContainerStarted","Data":"12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b"} Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.289263 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.295774 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"195d0b4d817a28e135e0ae43289e1f6f3228817156b270d4081a437efdcfc1e0"} Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.318098 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=43.124285222 podStartE2EDuration="55.318077959s" podCreationTimestamp="2026-03-18 15:55:20 +0000 UTC" firstStartedPulling="2026-03-18 15:55:28.150202231 +0000 UTC m=+1171.156376427" lastFinishedPulling="2026-03-18 15:55:40.343994958 +0000 UTC m=+1183.350169164" observedRunningTime="2026-03-18 15:56:15.314266244 +0000 UTC m=+1218.320440450" watchObservedRunningTime="2026-03-18 15:56:15.318077959 +0000 UTC 
m=+1218.324252165" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.355814 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.188321536 podStartE2EDuration="55.355791204s" podCreationTimestamp="2026-03-18 15:55:20 +0000 UTC" firstStartedPulling="2026-03-18 15:55:28.158111499 +0000 UTC m=+1171.164285705" lastFinishedPulling="2026-03-18 15:55:40.325581167 +0000 UTC m=+1183.331755373" observedRunningTime="2026-03-18 15:56:15.347880856 +0000 UTC m=+1218.354055072" watchObservedRunningTime="2026-03-18 15:56:15.355791204 +0000 UTC m=+1218.361965410" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.726431 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.803397 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-scripts\") pod \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.803834 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-combined-ca-bundle\") pod \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.803910 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-ring-data-devices\") pod \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.803936 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rdfgb\" (UniqueName: \"kubernetes.io/projected/62b4b14f-0ab3-4906-9c97-8c3092cd5379-kube-api-access-rdfgb\") pod \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.804054 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-dispersionconf\") pod \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.804108 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62b4b14f-0ab3-4906-9c97-8c3092cd5379-etc-swift\") pod \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.804133 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-swiftconf\") pod \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\" (UID: \"62b4b14f-0ab3-4906-9c97-8c3092cd5379\") " Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.804685 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "62b4b14f-0ab3-4906-9c97-8c3092cd5379" (UID: "62b4b14f-0ab3-4906-9c97-8c3092cd5379"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.805198 4696 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.805413 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b4b14f-0ab3-4906-9c97-8c3092cd5379-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "62b4b14f-0ab3-4906-9c97-8c3092cd5379" (UID: "62b4b14f-0ab3-4906-9c97-8c3092cd5379"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.827373 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b4b14f-0ab3-4906-9c97-8c3092cd5379-kube-api-access-rdfgb" (OuterVolumeSpecName: "kube-api-access-rdfgb") pod "62b4b14f-0ab3-4906-9c97-8c3092cd5379" (UID: "62b4b14f-0ab3-4906-9c97-8c3092cd5379"). InnerVolumeSpecName "kube-api-access-rdfgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.831977 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "62b4b14f-0ab3-4906-9c97-8c3092cd5379" (UID: "62b4b14f-0ab3-4906-9c97-8c3092cd5379"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.854935 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62b4b14f-0ab3-4906-9c97-8c3092cd5379" (UID: "62b4b14f-0ab3-4906-9c97-8c3092cd5379"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.865871 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-scripts" (OuterVolumeSpecName: "scripts") pod "62b4b14f-0ab3-4906-9c97-8c3092cd5379" (UID: "62b4b14f-0ab3-4906-9c97-8c3092cd5379"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.866692 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "62b4b14f-0ab3-4906-9c97-8c3092cd5379" (UID: "62b4b14f-0ab3-4906-9c97-8c3092cd5379"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.908661 4696 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.908708 4696 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62b4b14f-0ab3-4906-9c97-8c3092cd5379-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.908727 4696 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.908740 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b4b14f-0ab3-4906-9c97-8c3092cd5379-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.914239 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b4b14f-0ab3-4906-9c97-8c3092cd5379-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:15 crc kubenswrapper[4696]: I0318 15:56:15.914279 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdfgb\" (UniqueName: \"kubernetes.io/projected/62b4b14f-0ab3-4906-9c97-8c3092cd5379-kube-api-access-rdfgb\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:16 crc kubenswrapper[4696]: I0318 15:56:16.307209 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xj2ch" Mar 18 15:56:16 crc kubenswrapper[4696]: I0318 15:56:16.308746 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xj2ch" event={"ID":"62b4b14f-0ab3-4906-9c97-8c3092cd5379","Type":"ContainerDied","Data":"422eeff88d38f4b351db4c5bc6417ba3cbe7238caabee37ffa619764c6cdff29"} Mar 18 15:56:16 crc kubenswrapper[4696]: I0318 15:56:16.308818 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422eeff88d38f4b351db4c5bc6417ba3cbe7238caabee37ffa619764c6cdff29" Mar 18 15:56:18 crc kubenswrapper[4696]: I0318 15:56:18.158337 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-stdjz"] Mar 18 15:56:18 crc kubenswrapper[4696]: I0318 15:56:18.173243 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-stdjz"] Mar 18 15:56:18 crc kubenswrapper[4696]: I0318 15:56:18.567511 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 15:56:19 crc kubenswrapper[4696]: I0318 15:56:19.344577 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"df78c9ba13e5fa3fc95f22c6747b50415f6f670b83f89e4c2550fb208dd4da9c"} Mar 18 15:56:19 crc kubenswrapper[4696]: I0318 15:56:19.608173 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f810c54-e552-4322-b668-639f58cab1d1" path="/var/lib/kubelet/pods/8f810c54-e552-4322-b668-639f58cab1d1/volumes" Mar 18 15:56:21 crc kubenswrapper[4696]: I0318 15:56:21.743809 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vb7xn" podUID="efa7f696-eda9-4cd4-953b-0a24e9935290" containerName="ovn-controller" probeResult="failure" output=< Mar 18 15:56:21 crc kubenswrapper[4696]: ERROR - ovn-controller connection status is 
'not connected', expecting 'connected' status Mar 18 15:56:21 crc kubenswrapper[4696]: > Mar 18 15:56:21 crc kubenswrapper[4696]: I0318 15:56:21.761069 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.166608 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-g666r"] Mar 18 15:56:23 crc kubenswrapper[4696]: E0318 15:56:23.167343 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b4b14f-0ab3-4906-9c97-8c3092cd5379" containerName="swift-ring-rebalance" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.167357 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b4b14f-0ab3-4906-9c97-8c3092cd5379" containerName="swift-ring-rebalance" Mar 18 15:56:23 crc kubenswrapper[4696]: E0318 15:56:23.167369 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f810c54-e552-4322-b668-639f58cab1d1" containerName="mariadb-account-create-update" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.167376 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f810c54-e552-4322-b668-639f58cab1d1" containerName="mariadb-account-create-update" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.167539 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f810c54-e552-4322-b668-639f58cab1d1" containerName="mariadb-account-create-update" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.167559 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b4b14f-0ab3-4906-9c97-8c3092cd5379" containerName="swift-ring-rebalance" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.168060 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g666r" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.171301 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.182094 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g666r"] Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.361824 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ecc605-5dea-4131-9161-7adf4ba2db45-operator-scripts\") pod \"root-account-create-update-g666r\" (UID: \"78ecc605-5dea-4131-9161-7adf4ba2db45\") " pod="openstack/root-account-create-update-g666r" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.361961 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnqdz\" (UniqueName: \"kubernetes.io/projected/78ecc605-5dea-4131-9161-7adf4ba2db45-kube-api-access-lnqdz\") pod \"root-account-create-update-g666r\" (UID: \"78ecc605-5dea-4131-9161-7adf4ba2db45\") " pod="openstack/root-account-create-update-g666r" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.465184 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ecc605-5dea-4131-9161-7adf4ba2db45-operator-scripts\") pod \"root-account-create-update-g666r\" (UID: \"78ecc605-5dea-4131-9161-7adf4ba2db45\") " pod="openstack/root-account-create-update-g666r" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.465384 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnqdz\" (UniqueName: \"kubernetes.io/projected/78ecc605-5dea-4131-9161-7adf4ba2db45-kube-api-access-lnqdz\") pod \"root-account-create-update-g666r\" (UID: 
\"78ecc605-5dea-4131-9161-7adf4ba2db45\") " pod="openstack/root-account-create-update-g666r" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.466096 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ecc605-5dea-4131-9161-7adf4ba2db45-operator-scripts\") pod \"root-account-create-update-g666r\" (UID: \"78ecc605-5dea-4131-9161-7adf4ba2db45\") " pod="openstack/root-account-create-update-g666r" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.501888 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnqdz\" (UniqueName: \"kubernetes.io/projected/78ecc605-5dea-4131-9161-7adf4ba2db45-kube-api-access-lnqdz\") pod \"root-account-create-update-g666r\" (UID: \"78ecc605-5dea-4131-9161-7adf4ba2db45\") " pod="openstack/root-account-create-update-g666r" Mar 18 15:56:23 crc kubenswrapper[4696]: I0318 15:56:23.793643 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-g666r" Mar 18 15:56:26 crc kubenswrapper[4696]: I0318 15:56:26.721115 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vb7xn" podUID="efa7f696-eda9-4cd4-953b-0a24e9935290" containerName="ovn-controller" probeResult="failure" output=< Mar 18 15:56:26 crc kubenswrapper[4696]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 15:56:26 crc kubenswrapper[4696]: > Mar 18 15:56:26 crc kubenswrapper[4696]: I0318 15:56:26.745455 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-x4bkz" Mar 18 15:56:26 crc kubenswrapper[4696]: I0318 15:56:26.961739 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vb7xn-config-vjwc8"] Mar 18 15:56:26 crc kubenswrapper[4696]: I0318 15:56:26.963851 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:26 crc kubenswrapper[4696]: I0318 15:56:26.966545 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 15:56:26 crc kubenswrapper[4696]: I0318 15:56:26.978490 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vb7xn-config-vjwc8"] Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.138162 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-log-ovn\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.138251 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run-ovn\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.138276 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.138326 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-scripts\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") 
" pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.138381 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbvqs\" (UniqueName: \"kubernetes.io/projected/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-kube-api-access-rbvqs\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.138419 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-additional-scripts\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.240148 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-scripts\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.240252 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbvqs\" (UniqueName: \"kubernetes.io/projected/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-kube-api-access-rbvqs\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.240289 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-additional-scripts\") pod 
\"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.240333 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-log-ovn\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.240372 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run-ovn\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.240389 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.240860 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.241031 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-log-ovn\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: 
\"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.241100 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run-ovn\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.241435 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-additional-scripts\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.243489 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-scripts\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.262414 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbvqs\" (UniqueName: \"kubernetes.io/projected/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-kube-api-access-rbvqs\") pod \"ovn-controller-vb7xn-config-vjwc8\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.308065 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:27 crc kubenswrapper[4696]: E0318 15:56:27.360954 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 18 15:56:27 crc kubenswrapper[4696]: E0318 15:56:27.361215 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sp7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privi
leged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-sfb9t_openstack(6cfc5851-8295-4c4d-8cb4-4c18f9827227): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:56:27 crc kubenswrapper[4696]: E0318 15:56:27.363187 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-sfb9t" podUID="6cfc5851-8295-4c4d-8cb4-4c18f9827227" Mar 18 15:56:27 crc kubenswrapper[4696]: E0318 15:56:27.416409 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-sfb9t" podUID="6cfc5851-8295-4c4d-8cb4-4c18f9827227" Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.775904 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g666r"] Mar 18 15:56:27 crc kubenswrapper[4696]: W0318 15:56:27.778483 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78ecc605_5dea_4131_9161_7adf4ba2db45.slice/crio-6cc0d9e42db6162254fa5e83c7ff6a921c859638c5d9c55b83977efdffe63914 WatchSource:0}: Error finding container 6cc0d9e42db6162254fa5e83c7ff6a921c859638c5d9c55b83977efdffe63914: Status 
404 returned error can't find the container with id 6cc0d9e42db6162254fa5e83c7ff6a921c859638c5d9c55b83977efdffe63914 Mar 18 15:56:27 crc kubenswrapper[4696]: I0318 15:56:27.906110 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vb7xn-config-vjwc8"] Mar 18 15:56:27 crc kubenswrapper[4696]: W0318 15:56:27.911728 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c2fac8b_e62b_4eb6_a5fb_359f036f39b9.slice/crio-480d1c4cc15f5c8b82c87adc3ce58a60b980a62134fd2e32550c189073c27876 WatchSource:0}: Error finding container 480d1c4cc15f5c8b82c87adc3ce58a60b980a62134fd2e32550c189073c27876: Status 404 returned error can't find the container with id 480d1c4cc15f5c8b82c87adc3ce58a60b980a62134fd2e32550c189073c27876 Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.427547 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vb7xn-config-vjwc8" event={"ID":"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9","Type":"ContainerStarted","Data":"17bf9d28029c28497e000dca6f63f89dbaef9b7ec63b3c3be0a9331401fe8c3f"} Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.428083 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vb7xn-config-vjwc8" event={"ID":"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9","Type":"ContainerStarted","Data":"480d1c4cc15f5c8b82c87adc3ce58a60b980a62134fd2e32550c189073c27876"} Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.431234 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"a7501f0c4c0aec3dc8fcaca68f389a3d09f50d0446646a49432bec2bb091ff9b"} Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.431325 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"875214e727e1c5e5f6859a44b424d432266fbe03c7e19cca477508c9f1dee88d"} Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.431342 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"7f8729819fbabced155a873cc7fc9d9b2b141690b8aa1b618922af73e106dc66"} Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.433970 4696 generic.go:334] "Generic (PLEG): container finished" podID="78ecc605-5dea-4131-9161-7adf4ba2db45" containerID="e358bac5c31b29168432e7202220daa7e2674d485b8ecfa9f8e1a2f5a6fb25f1" exitCode=0 Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.434072 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g666r" event={"ID":"78ecc605-5dea-4131-9161-7adf4ba2db45","Type":"ContainerDied","Data":"e358bac5c31b29168432e7202220daa7e2674d485b8ecfa9f8e1a2f5a6fb25f1"} Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.434176 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g666r" event={"ID":"78ecc605-5dea-4131-9161-7adf4ba2db45","Type":"ContainerStarted","Data":"6cc0d9e42db6162254fa5e83c7ff6a921c859638c5d9c55b83977efdffe63914"} Mar 18 15:56:28 crc kubenswrapper[4696]: I0318 15:56:28.457911 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vb7xn-config-vjwc8" podStartSLOduration=2.457881762 podStartE2EDuration="2.457881762s" podCreationTimestamp="2026-03-18 15:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:28.450379024 +0000 UTC m=+1231.456553230" watchObservedRunningTime="2026-03-18 15:56:28.457881762 +0000 UTC m=+1231.464055978" Mar 18 15:56:29 crc kubenswrapper[4696]: I0318 15:56:29.449296 4696 
generic.go:334] "Generic (PLEG): container finished" podID="9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" containerID="17bf9d28029c28497e000dca6f63f89dbaef9b7ec63b3c3be0a9331401fe8c3f" exitCode=0 Mar 18 15:56:29 crc kubenswrapper[4696]: I0318 15:56:29.449974 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vb7xn-config-vjwc8" event={"ID":"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9","Type":"ContainerDied","Data":"17bf9d28029c28497e000dca6f63f89dbaef9b7ec63b3c3be0a9331401fe8c3f"} Mar 18 15:56:29 crc kubenswrapper[4696]: I0318 15:56:29.463277 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"a93a59c667fc4b5b1db7a413cd3a2eda18b33956ef0dc1b49938c0dda2222a3e"} Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:29.794396 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-g666r" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:29.815892 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnqdz\" (UniqueName: \"kubernetes.io/projected/78ecc605-5dea-4131-9161-7adf4ba2db45-kube-api-access-lnqdz\") pod \"78ecc605-5dea-4131-9161-7adf4ba2db45\" (UID: \"78ecc605-5dea-4131-9161-7adf4ba2db45\") " Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:29.816073 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ecc605-5dea-4131-9161-7adf4ba2db45-operator-scripts\") pod \"78ecc605-5dea-4131-9161-7adf4ba2db45\" (UID: \"78ecc605-5dea-4131-9161-7adf4ba2db45\") " Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:29.817427 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ecc605-5dea-4131-9161-7adf4ba2db45-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"78ecc605-5dea-4131-9161-7adf4ba2db45" (UID: "78ecc605-5dea-4131-9161-7adf4ba2db45"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:29.825514 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ecc605-5dea-4131-9161-7adf4ba2db45-kube-api-access-lnqdz" (OuterVolumeSpecName: "kube-api-access-lnqdz") pod "78ecc605-5dea-4131-9161-7adf4ba2db45" (UID: "78ecc605-5dea-4131-9161-7adf4ba2db45"). InnerVolumeSpecName "kube-api-access-lnqdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:29.917609 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnqdz\" (UniqueName: \"kubernetes.io/projected/78ecc605-5dea-4131-9161-7adf4ba2db45-kube-api-access-lnqdz\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:29.917637 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ecc605-5dea-4131-9161-7adf4ba2db45-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:30.482050 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"204f532964faa3aa28de019d0733f9852d9187bd9e3d087fd488ba7168894840"} Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:30.482564 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"28d85205bad420083e7bfb5a322374cad5e1dd69f2a196aff734b978f23cd7e2"} Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:30.482583 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"25bb1b373adf56a51281cad2e11d34a3ba394422750698aa81100398f5d51390"} Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:30.484502 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-g666r" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:30.486635 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g666r" event={"ID":"78ecc605-5dea-4131-9161-7adf4ba2db45","Type":"ContainerDied","Data":"6cc0d9e42db6162254fa5e83c7ff6a921c859638c5d9c55b83977efdffe63914"} Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:30.486671 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc0d9e42db6162254fa5e83c7ff6a921c859638c5d9c55b83977efdffe63914" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:31.724250 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vb7xn" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:32.012719 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:32.390718 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:33.998876 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-pbhp6"] Mar 18 15:56:34 crc kubenswrapper[4696]: E0318 15:56:33.999684 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ecc605-5dea-4131-9161-7adf4ba2db45" containerName="mariadb-account-create-update" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:33.999700 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ecc605-5dea-4131-9161-7adf4ba2db45" containerName="mariadb-account-create-update" Mar 18 15:56:34 
crc kubenswrapper[4696]: I0318 15:56:33.999895 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ecc605-5dea-4131-9161-7adf4ba2db45" containerName="mariadb-account-create-update" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.000489 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.017211 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pbhp6"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.109139 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b3c462-44a9-4899-a39b-463eda7dd5d0-operator-scripts\") pod \"cinder-db-create-pbhp6\" (UID: \"22b3c462-44a9-4899-a39b-463eda7dd5d0\") " pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.109281 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8q94\" (UniqueName: \"kubernetes.io/projected/22b3c462-44a9-4899-a39b-463eda7dd5d0-kube-api-access-p8q94\") pod \"cinder-db-create-pbhp6\" (UID: \"22b3c462-44a9-4899-a39b-463eda7dd5d0\") " pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.118157 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-97fc-account-create-update-xgqkx"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.119613 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.124351 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.135096 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97fc-account-create-update-xgqkx"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.214585 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcx98\" (UniqueName: \"kubernetes.io/projected/0788afc4-c079-463b-8d56-3a6be70dbf51-kube-api-access-jcx98\") pod \"barbican-97fc-account-create-update-xgqkx\" (UID: \"0788afc4-c079-463b-8d56-3a6be70dbf51\") " pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.214685 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8q94\" (UniqueName: \"kubernetes.io/projected/22b3c462-44a9-4899-a39b-463eda7dd5d0-kube-api-access-p8q94\") pod \"cinder-db-create-pbhp6\" (UID: \"22b3c462-44a9-4899-a39b-463eda7dd5d0\") " pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.214752 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b3c462-44a9-4899-a39b-463eda7dd5d0-operator-scripts\") pod \"cinder-db-create-pbhp6\" (UID: \"22b3c462-44a9-4899-a39b-463eda7dd5d0\") " pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.214790 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0788afc4-c079-463b-8d56-3a6be70dbf51-operator-scripts\") pod \"barbican-97fc-account-create-update-xgqkx\" (UID: 
\"0788afc4-c079-463b-8d56-3a6be70dbf51\") " pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.215936 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b3c462-44a9-4899-a39b-463eda7dd5d0-operator-scripts\") pod \"cinder-db-create-pbhp6\" (UID: \"22b3c462-44a9-4899-a39b-463eda7dd5d0\") " pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.223748 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-12ff-account-create-update-mqz6l"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.225215 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.231047 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.258196 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8q94\" (UniqueName: \"kubernetes.io/projected/22b3c462-44a9-4899-a39b-463eda7dd5d0-kube-api-access-p8q94\") pod \"cinder-db-create-pbhp6\" (UID: \"22b3c462-44a9-4899-a39b-463eda7dd5d0\") " pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.276598 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-12ff-account-create-update-mqz6l"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.312353 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9jmt6"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.314168 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.321169 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pdpjw" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.321434 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.321693 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.322004 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kts9w\" (UniqueName: \"kubernetes.io/projected/370a3d71-3446-40b6-80ed-7efdb61a36a5-kube-api-access-kts9w\") pod \"cinder-12ff-account-create-update-mqz6l\" (UID: \"370a3d71-3446-40b6-80ed-7efdb61a36a5\") " pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.322106 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/370a3d71-3446-40b6-80ed-7efdb61a36a5-operator-scripts\") pod \"cinder-12ff-account-create-update-mqz6l\" (UID: \"370a3d71-3446-40b6-80ed-7efdb61a36a5\") " pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.322174 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0788afc4-c079-463b-8d56-3a6be70dbf51-operator-scripts\") pod \"barbican-97fc-account-create-update-xgqkx\" (UID: \"0788afc4-c079-463b-8d56-3a6be70dbf51\") " pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.322715 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-jcx98\" (UniqueName: \"kubernetes.io/projected/0788afc4-c079-463b-8d56-3a6be70dbf51-kube-api-access-jcx98\") pod \"barbican-97fc-account-create-update-xgqkx\" (UID: \"0788afc4-c079-463b-8d56-3a6be70dbf51\") " pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.323006 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.323095 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-z72n4"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.323785 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0788afc4-c079-463b-8d56-3a6be70dbf51-operator-scripts\") pod \"barbican-97fc-account-create-update-xgqkx\" (UID: \"0788afc4-c079-463b-8d56-3a6be70dbf51\") " pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.326940 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.335645 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.347453 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcx98\" (UniqueName: \"kubernetes.io/projected/0788afc4-c079-463b-8d56-3a6be70dbf51-kube-api-access-jcx98\") pod \"barbican-97fc-account-create-update-xgqkx\" (UID: \"0788afc4-c079-463b-8d56-3a6be70dbf51\") " pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.347562 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9jmt6"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.355331 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-z72n4"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.422354 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-twrd5"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.423804 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.424504 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-combined-ca-bundle\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.424704 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-operator-scripts\") pod \"barbican-db-create-z72n4\" (UID: \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\") " pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.424739 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csv29\" (UniqueName: \"kubernetes.io/projected/c6c5159e-3018-4c1b-8f1c-b40e157d043b-kube-api-access-csv29\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.424843 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-config-data\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.424898 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kts9w\" (UniqueName: \"kubernetes.io/projected/370a3d71-3446-40b6-80ed-7efdb61a36a5-kube-api-access-kts9w\") pod \"cinder-12ff-account-create-update-mqz6l\" (UID: 
\"370a3d71-3446-40b6-80ed-7efdb61a36a5\") " pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.424935 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqfw\" (UniqueName: \"kubernetes.io/projected/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-kube-api-access-zbqfw\") pod \"barbican-db-create-z72n4\" (UID: \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\") " pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.424969 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/370a3d71-3446-40b6-80ed-7efdb61a36a5-operator-scripts\") pod \"cinder-12ff-account-create-update-mqz6l\" (UID: \"370a3d71-3446-40b6-80ed-7efdb61a36a5\") " pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.426886 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/370a3d71-3446-40b6-80ed-7efdb61a36a5-operator-scripts\") pod \"cinder-12ff-account-create-update-mqz6l\" (UID: \"370a3d71-3446-40b6-80ed-7efdb61a36a5\") " pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.448326 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.452532 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kts9w\" (UniqueName: \"kubernetes.io/projected/370a3d71-3446-40b6-80ed-7efdb61a36a5-kube-api-access-kts9w\") pod \"cinder-12ff-account-create-update-mqz6l\" (UID: \"370a3d71-3446-40b6-80ed-7efdb61a36a5\") " pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.457296 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-twrd5"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.527374 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0180-account-create-update-dk749"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.527557 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-config-data\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.527615 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqfw\" (UniqueName: \"kubernetes.io/projected/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-kube-api-access-zbqfw\") pod \"barbican-db-create-z72n4\" (UID: \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\") " pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.527660 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b834ff85-81d1-4a20-9f59-0790a7492dfc-operator-scripts\") pod \"neutron-db-create-twrd5\" (UID: \"b834ff85-81d1-4a20-9f59-0790a7492dfc\") " pod="openstack/neutron-db-create-twrd5" 
Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.527711 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gqqz\" (UniqueName: \"kubernetes.io/projected/b834ff85-81d1-4a20-9f59-0790a7492dfc-kube-api-access-8gqqz\") pod \"neutron-db-create-twrd5\" (UID: \"b834ff85-81d1-4a20-9f59-0790a7492dfc\") " pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.527745 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-combined-ca-bundle\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.527775 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-operator-scripts\") pod \"barbican-db-create-z72n4\" (UID: \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\") " pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.527796 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csv29\" (UniqueName: \"kubernetes.io/projected/c6c5159e-3018-4c1b-8f1c-b40e157d043b-kube-api-access-csv29\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.529487 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.533007 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.535008 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-operator-scripts\") pod \"barbican-db-create-z72n4\" (UID: \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\") " pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.536730 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-config-data\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.550986 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0180-account-create-update-dk749"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.551122 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-combined-ca-bundle\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.552266 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vb7xn-config-vjwc8" event={"ID":"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9","Type":"ContainerDied","Data":"480d1c4cc15f5c8b82c87adc3ce58a60b980a62134fd2e32550c189073c27876"} Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.552315 4696 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="480d1c4cc15f5c8b82c87adc3ce58a60b980a62134fd2e32550c189073c27876" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.556671 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.558287 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqfw\" (UniqueName: \"kubernetes.io/projected/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-kube-api-access-zbqfw\") pod \"barbican-db-create-z72n4\" (UID: \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\") " pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.572153 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csv29\" (UniqueName: \"kubernetes.io/projected/c6c5159e-3018-4c1b-8f1c-b40e157d043b-kube-api-access-csv29\") pod \"keystone-db-sync-9jmt6\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.599958 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.629698 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b834ff85-81d1-4a20-9f59-0790a7492dfc-operator-scripts\") pod \"neutron-db-create-twrd5\" (UID: \"b834ff85-81d1-4a20-9f59-0790a7492dfc\") " pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.631323 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb0c55b7-1488-4ae0-8d27-d063761edde5-operator-scripts\") pod \"neutron-0180-account-create-update-dk749\" (UID: \"cb0c55b7-1488-4ae0-8d27-d063761edde5\") " pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.631221 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b834ff85-81d1-4a20-9f59-0790a7492dfc-operator-scripts\") pod \"neutron-db-create-twrd5\" (UID: \"b834ff85-81d1-4a20-9f59-0790a7492dfc\") " pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.631388 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gqqz\" (UniqueName: \"kubernetes.io/projected/b834ff85-81d1-4a20-9f59-0790a7492dfc-kube-api-access-8gqqz\") pod \"neutron-db-create-twrd5\" (UID: \"b834ff85-81d1-4a20-9f59-0790a7492dfc\") " pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.631469 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4v9z\" (UniqueName: \"kubernetes.io/projected/cb0c55b7-1488-4ae0-8d27-d063761edde5-kube-api-access-j4v9z\") pod 
\"neutron-0180-account-create-update-dk749\" (UID: \"cb0c55b7-1488-4ae0-8d27-d063761edde5\") " pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.659633 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gqqz\" (UniqueName: \"kubernetes.io/projected/b834ff85-81d1-4a20-9f59-0790a7492dfc-kube-api-access-8gqqz\") pod \"neutron-db-create-twrd5\" (UID: \"b834ff85-81d1-4a20-9f59-0790a7492dfc\") " pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.733585 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-log-ovn\") pod \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.733634 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run-ovn\") pod \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.733725 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-additional-scripts\") pod \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.733816 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run\") pod \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.733843 
4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" (UID: "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.733858 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbvqs\" (UniqueName: \"kubernetes.io/projected/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-kube-api-access-rbvqs\") pod \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.733933 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-scripts\") pod \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\" (UID: \"9c2fac8b-e62b-4eb6-a5fb-359f036f39b9\") " Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.734649 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb0c55b7-1488-4ae0-8d27-d063761edde5-operator-scripts\") pod \"neutron-0180-account-create-update-dk749\" (UID: \"cb0c55b7-1488-4ae0-8d27-d063761edde5\") " pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.734734 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4v9z\" (UniqueName: \"kubernetes.io/projected/cb0c55b7-1488-4ae0-8d27-d063761edde5-kube-api-access-j4v9z\") pod \"neutron-0180-account-create-update-dk749\" (UID: \"cb0c55b7-1488-4ae0-8d27-d063761edde5\") " pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.734841 4696 
reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.736624 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-scripts" (OuterVolumeSpecName: "scripts") pod "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" (UID: "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.737428 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb0c55b7-1488-4ae0-8d27-d063761edde5-operator-scripts\") pod \"neutron-0180-account-create-update-dk749\" (UID: \"cb0c55b7-1488-4ae0-8d27-d063761edde5\") " pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.739456 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" (UID: "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.739497 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" (UID: "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.739538 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run" (OuterVolumeSpecName: "var-run") pod "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" (UID: "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.740728 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.741563 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-kube-api-access-rbvqs" (OuterVolumeSpecName: "kube-api-access-rbvqs") pod "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" (UID: "9c2fac8b-e62b-4eb6-a5fb-359f036f39b9"). InnerVolumeSpecName "kube-api-access-rbvqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.748415 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pbhp6"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.748826 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:34 crc kubenswrapper[4696]: W0318 15:56:34.764469 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22b3c462_44a9_4899_a39b_463eda7dd5d0.slice/crio-a6d31607a0c6da807f995a56d5073d72ae0b3ff60b775bbf30e6d136e2ce5664 WatchSource:0}: Error finding container a6d31607a0c6da807f995a56d5073d72ae0b3ff60b775bbf30e6d136e2ce5664: Status 404 returned error can't find the container with id a6d31607a0c6da807f995a56d5073d72ae0b3ff60b775bbf30e6d136e2ce5664 Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.764609 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4v9z\" (UniqueName: \"kubernetes.io/projected/cb0c55b7-1488-4ae0-8d27-d063761edde5-kube-api-access-j4v9z\") pod \"neutron-0180-account-create-update-dk749\" (UID: \"cb0c55b7-1488-4ae0-8d27-d063761edde5\") " pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.835939 4696 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.835976 4696 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.835986 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbvqs\" (UniqueName: \"kubernetes.io/projected/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-kube-api-access-rbvqs\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.835995 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.836005 4696 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.843508 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97fc-account-create-update-xgqkx"] Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.851804 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:34 crc kubenswrapper[4696]: I0318 15:56:34.884752 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.199662 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-12ff-account-create-update-mqz6l"] Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.562810 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pbhp6" event={"ID":"22b3c462-44a9-4899-a39b-463eda7dd5d0","Type":"ContainerStarted","Data":"a6d31607a0c6da807f995a56d5073d72ae0b3ff60b775bbf30e6d136e2ce5664"} Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.565793 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-12ff-account-create-update-mqz6l" event={"ID":"370a3d71-3446-40b6-80ed-7efdb61a36a5","Type":"ContainerStarted","Data":"c34fb0267240a68d0c17860964e2490f31bdafd48f85299590db79064af8401a"} Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.574510 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vb7xn-config-vjwc8" Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.575559 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97fc-account-create-update-xgqkx" event={"ID":"0788afc4-c079-463b-8d56-3a6be70dbf51","Type":"ContainerStarted","Data":"823596f77dbd226ddaf3eb2ecfa26f27e6c92405bdf24d6c4844cdc6fb4a31b4"} Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.704939 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vb7xn-config-vjwc8"] Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.731027 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vb7xn-config-vjwc8"] Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.837028 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9jmt6"] Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.878931 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-z72n4"] Mar 18 15:56:35 crc kubenswrapper[4696]: W0318 15:56:35.982995 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb0c55b7_1488_4ae0_8d27_d063761edde5.slice/crio-b066cd1c9301e5ed50383124038d0fbde9c233d2d810d165e73f47167e6f861f WatchSource:0}: Error finding container b066cd1c9301e5ed50383124038d0fbde9c233d2d810d165e73f47167e6f861f: Status 404 returned error can't find the container with id b066cd1c9301e5ed50383124038d0fbde9c233d2d810d165e73f47167e6f861f Mar 18 15:56:35 crc kubenswrapper[4696]: W0318 15:56:35.990086 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb834ff85_81d1_4a20_9f59_0790a7492dfc.slice/crio-3c89bbcf7636acf00982f7bb6949315a708bf2219b5f95b3d3aaab4d3743ef6b WatchSource:0}: Error finding container 
3c89bbcf7636acf00982f7bb6949315a708bf2219b5f95b3d3aaab4d3743ef6b: Status 404 returned error can't find the container with id 3c89bbcf7636acf00982f7bb6949315a708bf2219b5f95b3d3aaab4d3743ef6b Mar 18 15:56:35 crc kubenswrapper[4696]: I0318 15:56:35.991550 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0180-account-create-update-dk749"] Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.003002 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-twrd5"] Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.584162 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-twrd5" event={"ID":"b834ff85-81d1-4a20-9f59-0790a7492dfc","Type":"ContainerStarted","Data":"6ced00f98ca046024dae8e9ce6bdae0071fa5ae95596ed961d36eb7f76d42250"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.584273 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-twrd5" event={"ID":"b834ff85-81d1-4a20-9f59-0790a7492dfc","Type":"ContainerStarted","Data":"3c89bbcf7636acf00982f7bb6949315a708bf2219b5f95b3d3aaab4d3743ef6b"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.591597 4696 generic.go:334] "Generic (PLEG): container finished" podID="7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c" containerID="5422047cd85de969300b22e3f45d67a48493b53b10f136aeee116bac24add09a" exitCode=0 Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.591639 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-z72n4" event={"ID":"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c","Type":"ContainerDied","Data":"5422047cd85de969300b22e3f45d67a48493b53b10f136aeee116bac24add09a"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.591675 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-z72n4" 
event={"ID":"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c","Type":"ContainerStarted","Data":"977a9a7515b92b19bd68c027f9594583eee579cfd61eef7703e0a7f5d23197a0"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.612820 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-twrd5" podStartSLOduration=2.612798757 podStartE2EDuration="2.612798757s" podCreationTimestamp="2026-03-18 15:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:36.597912634 +0000 UTC m=+1239.604086840" watchObservedRunningTime="2026-03-18 15:56:36.612798757 +0000 UTC m=+1239.618972963" Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.627009 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"60bd35b7e791a23a7f441db9892949ee154af151505231d7a58de9a9bc304ed5"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.627063 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"e29fc99253dc8bd3db4249baf57ccc066e2b6c6646279eb38e521437e038fc4c"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.627072 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"532567b82faadec4c1ca2f99d1a16d9921e6bc6a05ddc3a33f647a78e3292d87"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.631650 4696 generic.go:334] "Generic (PLEG): container finished" podID="370a3d71-3446-40b6-80ed-7efdb61a36a5" containerID="178a3592ec78d7aebcb36d5c68a0222e89f924f5d132570b1f4ab19d655171a6" exitCode=0 Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.631729 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-12ff-account-create-update-mqz6l" event={"ID":"370a3d71-3446-40b6-80ed-7efdb61a36a5","Type":"ContainerDied","Data":"178a3592ec78d7aebcb36d5c68a0222e89f924f5d132570b1f4ab19d655171a6"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.641029 4696 generic.go:334] "Generic (PLEG): container finished" podID="0788afc4-c079-463b-8d56-3a6be70dbf51" containerID="c7e497d3a38b51966753a5b7a8887757487d9addbb5506013a45052362a2a504" exitCode=0 Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.641160 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97fc-account-create-update-xgqkx" event={"ID":"0788afc4-c079-463b-8d56-3a6be70dbf51","Type":"ContainerDied","Data":"c7e497d3a38b51966753a5b7a8887757487d9addbb5506013a45052362a2a504"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.645372 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9jmt6" event={"ID":"c6c5159e-3018-4c1b-8f1c-b40e157d043b","Type":"ContainerStarted","Data":"1eefba6279e8ad10c7770a7fc53b0ea9b7ee543e6116641335d1f2e0bd2733c0"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.656423 4696 generic.go:334] "Generic (PLEG): container finished" podID="22b3c462-44a9-4899-a39b-463eda7dd5d0" containerID="c8708acc6cf893d512b0b488bdc35b0d9de8fe29b6fcdbf4bad6511fe1dcb787" exitCode=0 Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.656604 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pbhp6" event={"ID":"22b3c462-44a9-4899-a39b-463eda7dd5d0","Type":"ContainerDied","Data":"c8708acc6cf893d512b0b488bdc35b0d9de8fe29b6fcdbf4bad6511fe1dcb787"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.658788 4696 generic.go:334] "Generic (PLEG): container finished" podID="cb0c55b7-1488-4ae0-8d27-d063761edde5" containerID="27865e2af2387aeadd20105cbe4941b24c3d43b80e358bf96cc8c0de2ad19f8d" exitCode=0 Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.658824 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-0180-account-create-update-dk749" event={"ID":"cb0c55b7-1488-4ae0-8d27-d063761edde5","Type":"ContainerDied","Data":"27865e2af2387aeadd20105cbe4941b24c3d43b80e358bf96cc8c0de2ad19f8d"} Mar 18 15:56:36 crc kubenswrapper[4696]: I0318 15:56:36.658846 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0180-account-create-update-dk749" event={"ID":"cb0c55b7-1488-4ae0-8d27-d063761edde5","Type":"ContainerStarted","Data":"b066cd1c9301e5ed50383124038d0fbde9c233d2d810d165e73f47167e6f861f"} Mar 18 15:56:37 crc kubenswrapper[4696]: I0318 15:56:37.614400 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" path="/var/lib/kubelet/pods/9c2fac8b-e62b-4eb6-a5fb-359f036f39b9/volumes" Mar 18 15:56:37 crc kubenswrapper[4696]: I0318 15:56:37.673224 4696 generic.go:334] "Generic (PLEG): container finished" podID="b834ff85-81d1-4a20-9f59-0790a7492dfc" containerID="6ced00f98ca046024dae8e9ce6bdae0071fa5ae95596ed961d36eb7f76d42250" exitCode=0 Mar 18 15:56:37 crc kubenswrapper[4696]: I0318 15:56:37.673299 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-twrd5" event={"ID":"b834ff85-81d1-4a20-9f59-0790a7492dfc","Type":"ContainerDied","Data":"6ced00f98ca046024dae8e9ce6bdae0071fa5ae95596ed961d36eb7f76d42250"} Mar 18 15:56:37 crc kubenswrapper[4696]: I0318 15:56:37.685224 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"c386695e00c9eb644920b580d1cd773814f21be34164d812e6f1363646a62a01"} Mar 18 15:56:37 crc kubenswrapper[4696]: I0318 15:56:37.685265 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"32b1dcded5f65580d79f673ce3d69cfd1fb1ba13734a8b2ddd0e8e849501e34b"} Mar 18 15:56:37 
crc kubenswrapper[4696]: I0318 15:56:37.685277 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"20b8c084189df12f921aa89b7b3f8391617fced46dbba8cfea7839df2f9c418f"} Mar 18 15:56:37 crc kubenswrapper[4696]: I0318 15:56:37.685287 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"acfc5351-8c75-4362-8e66-b9ade04d74eb","Type":"ContainerStarted","Data":"db53b49dde9050cf010a4cdd88190b9b4beacff04f30ce6796ce9969bc0930c3"} Mar 18 15:56:37 crc kubenswrapper[4696]: I0318 15:56:37.754954 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.507356035 podStartE2EDuration="40.754923256s" podCreationTimestamp="2026-03-18 15:55:57 +0000 UTC" firstStartedPulling="2026-03-18 15:56:15.147063164 +0000 UTC m=+1218.153237360" lastFinishedPulling="2026-03-18 15:56:35.394630385 +0000 UTC m=+1238.400804581" observedRunningTime="2026-03-18 15:56:37.74751384 +0000 UTC m=+1240.753688066" watchObservedRunningTime="2026-03-18 15:56:37.754923256 +0000 UTC m=+1240.761097472" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.059095 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nt7xg"] Mar 18 15:56:38 crc kubenswrapper[4696]: E0318 15:56:38.059498 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" containerName="ovn-config" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.059511 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" containerName="ovn-config" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.063964 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2fac8b-e62b-4eb6-a5fb-359f036f39b9" containerName="ovn-config" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 
15:56:38.065055 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.076833 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.080857 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nt7xg"] Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.276225 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-config\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.276555 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.276588 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.276671 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: 
\"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.276744 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8t2j\" (UniqueName: \"kubernetes.io/projected/9514b639-882b-4ae4-ad29-4e5a24c80e66-kube-api-access-h8t2j\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.276775 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.296699 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.389752 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-config\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.390565 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-config\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.390926 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.392114 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.392852 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc 
kubenswrapper[4696]: I0318 15:56:38.393123 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.393755 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.393160 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8t2j\" (UniqueName: \"kubernetes.io/projected/9514b639-882b-4ae4-ad29-4e5a24c80e66-kube-api-access-h8t2j\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.393831 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.394426 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.397089 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.413452 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8t2j\" (UniqueName: \"kubernetes.io/projected/9514b639-882b-4ae4-ad29-4e5a24c80e66-kube-api-access-h8t2j\") pod \"dnsmasq-dns-5c79d794d7-nt7xg\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.494882 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b3c462-44a9-4899-a39b-463eda7dd5d0-operator-scripts\") pod \"22b3c462-44a9-4899-a39b-463eda7dd5d0\" (UID: \"22b3c462-44a9-4899-a39b-463eda7dd5d0\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.494928 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8q94\" (UniqueName: \"kubernetes.io/projected/22b3c462-44a9-4899-a39b-463eda7dd5d0-kube-api-access-p8q94\") pod \"22b3c462-44a9-4899-a39b-463eda7dd5d0\" (UID: \"22b3c462-44a9-4899-a39b-463eda7dd5d0\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.496104 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b3c462-44a9-4899-a39b-463eda7dd5d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22b3c462-44a9-4899-a39b-463eda7dd5d0" (UID: "22b3c462-44a9-4899-a39b-463eda7dd5d0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.498398 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b3c462-44a9-4899-a39b-463eda7dd5d0-kube-api-access-p8q94" (OuterVolumeSpecName: "kube-api-access-p8q94") pod "22b3c462-44a9-4899-a39b-463eda7dd5d0" (UID: "22b3c462-44a9-4899-a39b-463eda7dd5d0"). InnerVolumeSpecName "kube-api-access-p8q94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.537714 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.546653 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.558764 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.563327 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.589057 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.596974 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-operator-scripts\") pod \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\" (UID: \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597024 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb0c55b7-1488-4ae0-8d27-d063761edde5-operator-scripts\") pod \"cb0c55b7-1488-4ae0-8d27-d063761edde5\" (UID: \"cb0c55b7-1488-4ae0-8d27-d063761edde5\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597068 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcx98\" (UniqueName: \"kubernetes.io/projected/0788afc4-c079-463b-8d56-3a6be70dbf51-kube-api-access-jcx98\") pod \"0788afc4-c079-463b-8d56-3a6be70dbf51\" (UID: \"0788afc4-c079-463b-8d56-3a6be70dbf51\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597114 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0788afc4-c079-463b-8d56-3a6be70dbf51-operator-scripts\") pod \"0788afc4-c079-463b-8d56-3a6be70dbf51\" (UID: \"0788afc4-c079-463b-8d56-3a6be70dbf51\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597149 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbqfw\" (UniqueName: \"kubernetes.io/projected/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-kube-api-access-zbqfw\") pod \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\" (UID: \"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597177 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-j4v9z\" (UniqueName: \"kubernetes.io/projected/cb0c55b7-1488-4ae0-8d27-d063761edde5-kube-api-access-j4v9z\") pod \"cb0c55b7-1488-4ae0-8d27-d063761edde5\" (UID: \"cb0c55b7-1488-4ae0-8d27-d063761edde5\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597218 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kts9w\" (UniqueName: \"kubernetes.io/projected/370a3d71-3446-40b6-80ed-7efdb61a36a5-kube-api-access-kts9w\") pod \"370a3d71-3446-40b6-80ed-7efdb61a36a5\" (UID: \"370a3d71-3446-40b6-80ed-7efdb61a36a5\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597248 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/370a3d71-3446-40b6-80ed-7efdb61a36a5-operator-scripts\") pod \"370a3d71-3446-40b6-80ed-7efdb61a36a5\" (UID: \"370a3d71-3446-40b6-80ed-7efdb61a36a5\") " Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597532 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b3c462-44a9-4899-a39b-463eda7dd5d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.597552 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8q94\" (UniqueName: \"kubernetes.io/projected/22b3c462-44a9-4899-a39b-463eda7dd5d0-kube-api-access-p8q94\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.598241 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb0c55b7-1488-4ae0-8d27-d063761edde5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb0c55b7-1488-4ae0-8d27-d063761edde5" (UID: "cb0c55b7-1488-4ae0-8d27-d063761edde5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.598267 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0788afc4-c079-463b-8d56-3a6be70dbf51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0788afc4-c079-463b-8d56-3a6be70dbf51" (UID: "0788afc4-c079-463b-8d56-3a6be70dbf51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.598284 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c" (UID: "7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.598809 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370a3d71-3446-40b6-80ed-7efdb61a36a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "370a3d71-3446-40b6-80ed-7efdb61a36a5" (UID: "370a3d71-3446-40b6-80ed-7efdb61a36a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.602923 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0788afc4-c079-463b-8d56-3a6be70dbf51-kube-api-access-jcx98" (OuterVolumeSpecName: "kube-api-access-jcx98") pod "0788afc4-c079-463b-8d56-3a6be70dbf51" (UID: "0788afc4-c079-463b-8d56-3a6be70dbf51"). InnerVolumeSpecName "kube-api-access-jcx98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.605755 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0c55b7-1488-4ae0-8d27-d063761edde5-kube-api-access-j4v9z" (OuterVolumeSpecName: "kube-api-access-j4v9z") pod "cb0c55b7-1488-4ae0-8d27-d063761edde5" (UID: "cb0c55b7-1488-4ae0-8d27-d063761edde5"). InnerVolumeSpecName "kube-api-access-j4v9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.607728 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-kube-api-access-zbqfw" (OuterVolumeSpecName: "kube-api-access-zbqfw") pod "7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c" (UID: "7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c"). InnerVolumeSpecName "kube-api-access-zbqfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.607777 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370a3d71-3446-40b6-80ed-7efdb61a36a5-kube-api-access-kts9w" (OuterVolumeSpecName: "kube-api-access-kts9w") pod "370a3d71-3446-40b6-80ed-7efdb61a36a5" (UID: "370a3d71-3446-40b6-80ed-7efdb61a36a5"). InnerVolumeSpecName "kube-api-access-kts9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.700310 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kts9w\" (UniqueName: \"kubernetes.io/projected/370a3d71-3446-40b6-80ed-7efdb61a36a5-kube-api-access-kts9w\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.700736 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/370a3d71-3446-40b6-80ed-7efdb61a36a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.700747 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.700757 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb0c55b7-1488-4ae0-8d27-d063761edde5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.700768 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcx98\" (UniqueName: \"kubernetes.io/projected/0788afc4-c079-463b-8d56-3a6be70dbf51-kube-api-access-jcx98\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.700777 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0788afc4-c079-463b-8d56-3a6be70dbf51-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.700787 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbqfw\" (UniqueName: \"kubernetes.io/projected/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c-kube-api-access-zbqfw\") on node \"crc\" DevicePath \"\"" Mar 18 
15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.700797 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4v9z\" (UniqueName: \"kubernetes.io/projected/cb0c55b7-1488-4ae0-8d27-d063761edde5-kube-api-access-j4v9z\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.715192 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-z72n4" event={"ID":"7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c","Type":"ContainerDied","Data":"977a9a7515b92b19bd68c027f9594583eee579cfd61eef7703e0a7f5d23197a0"} Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.715231 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="977a9a7515b92b19bd68c027f9594583eee579cfd61eef7703e0a7f5d23197a0" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.715301 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-z72n4" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.720401 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-12ff-account-create-update-mqz6l" event={"ID":"370a3d71-3446-40b6-80ed-7efdb61a36a5","Type":"ContainerDied","Data":"c34fb0267240a68d0c17860964e2490f31bdafd48f85299590db79064af8401a"} Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.720459 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-12ff-account-create-update-mqz6l" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.720467 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c34fb0267240a68d0c17860964e2490f31bdafd48f85299590db79064af8401a" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.722610 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-97fc-account-create-update-xgqkx" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.722592 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97fc-account-create-update-xgqkx" event={"ID":"0788afc4-c079-463b-8d56-3a6be70dbf51","Type":"ContainerDied","Data":"823596f77dbd226ddaf3eb2ecfa26f27e6c92405bdf24d6c4844cdc6fb4a31b4"} Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.722649 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823596f77dbd226ddaf3eb2ecfa26f27e6c92405bdf24d6c4844cdc6fb4a31b4" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.732831 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pbhp6" event={"ID":"22b3c462-44a9-4899-a39b-463eda7dd5d0","Type":"ContainerDied","Data":"a6d31607a0c6da807f995a56d5073d72ae0b3ff60b775bbf30e6d136e2ce5664"} Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.732867 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d31607a0c6da807f995a56d5073d72ae0b3ff60b775bbf30e6d136e2ce5664" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.732925 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pbhp6" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.737349 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0180-account-create-update-dk749" event={"ID":"cb0c55b7-1488-4ae0-8d27-d063761edde5","Type":"ContainerDied","Data":"b066cd1c9301e5ed50383124038d0fbde9c233d2d810d165e73f47167e6f861f"} Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.737384 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b066cd1c9301e5ed50383124038d0fbde9c233d2d810d165e73f47167e6f861f" Mar 18 15:56:38 crc kubenswrapper[4696]: I0318 15:56:38.737908 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0180-account-create-update-dk749" Mar 18 15:56:39 crc kubenswrapper[4696]: I0318 15:56:39.066366 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nt7xg"] Mar 18 15:56:41 crc kubenswrapper[4696]: W0318 15:56:41.369709 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9514b639_882b_4ae4_ad29_4e5a24c80e66.slice/crio-325671e71d195ebf604692062c068c055258f41987f2b128430ee6e3dfe37564 WatchSource:0}: Error finding container 325671e71d195ebf604692062c068c055258f41987f2b128430ee6e3dfe37564: Status 404 returned error can't find the container with id 325671e71d195ebf604692062c068c055258f41987f2b128430ee6e3dfe37564 Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.565558 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.666985 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b834ff85-81d1-4a20-9f59-0790a7492dfc-operator-scripts\") pod \"b834ff85-81d1-4a20-9f59-0790a7492dfc\" (UID: \"b834ff85-81d1-4a20-9f59-0790a7492dfc\") " Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.667068 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gqqz\" (UniqueName: \"kubernetes.io/projected/b834ff85-81d1-4a20-9f59-0790a7492dfc-kube-api-access-8gqqz\") pod \"b834ff85-81d1-4a20-9f59-0790a7492dfc\" (UID: \"b834ff85-81d1-4a20-9f59-0790a7492dfc\") " Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.667819 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b834ff85-81d1-4a20-9f59-0790a7492dfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"b834ff85-81d1-4a20-9f59-0790a7492dfc" (UID: "b834ff85-81d1-4a20-9f59-0790a7492dfc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.670044 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b834ff85-81d1-4a20-9f59-0790a7492dfc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.674784 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b834ff85-81d1-4a20-9f59-0790a7492dfc-kube-api-access-8gqqz" (OuterVolumeSpecName: "kube-api-access-8gqqz") pod "b834ff85-81d1-4a20-9f59-0790a7492dfc" (UID: "b834ff85-81d1-4a20-9f59-0790a7492dfc"). InnerVolumeSpecName "kube-api-access-8gqqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.771010 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gqqz\" (UniqueName: \"kubernetes.io/projected/b834ff85-81d1-4a20-9f59-0790a7492dfc-kube-api-access-8gqqz\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.775287 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-twrd5" event={"ID":"b834ff85-81d1-4a20-9f59-0790a7492dfc","Type":"ContainerDied","Data":"3c89bbcf7636acf00982f7bb6949315a708bf2219b5f95b3d3aaab4d3743ef6b"} Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.775323 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c89bbcf7636acf00982f7bb6949315a708bf2219b5f95b3d3aaab4d3743ef6b" Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.775327 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-twrd5" Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.777530 4696 generic.go:334] "Generic (PLEG): container finished" podID="9514b639-882b-4ae4-ad29-4e5a24c80e66" containerID="3317a11522ef86cbef57116cf30883aefc4e183bf21174e92c3253650c7a7558" exitCode=0 Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.777581 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" event={"ID":"9514b639-882b-4ae4-ad29-4e5a24c80e66","Type":"ContainerDied","Data":"3317a11522ef86cbef57116cf30883aefc4e183bf21174e92c3253650c7a7558"} Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.777599 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" event={"ID":"9514b639-882b-4ae4-ad29-4e5a24c80e66","Type":"ContainerStarted","Data":"325671e71d195ebf604692062c068c055258f41987f2b128430ee6e3dfe37564"} Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.788429 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9jmt6" event={"ID":"c6c5159e-3018-4c1b-8f1c-b40e157d043b","Type":"ContainerStarted","Data":"baa59b49b2d04c27eac13aa0671f98810ecefa397af8a9a414f1ca6baa5a8367"} Mar 18 15:56:41 crc kubenswrapper[4696]: I0318 15:56:41.832108 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9jmt6" podStartSLOduration=2.257434344 podStartE2EDuration="7.832090797s" podCreationTimestamp="2026-03-18 15:56:34 +0000 UTC" firstStartedPulling="2026-03-18 15:56:35.865806621 +0000 UTC m=+1238.871980827" lastFinishedPulling="2026-03-18 15:56:41.440463074 +0000 UTC m=+1244.446637280" observedRunningTime="2026-03-18 15:56:41.825551323 +0000 UTC m=+1244.831725529" watchObservedRunningTime="2026-03-18 15:56:41.832090797 +0000 UTC m=+1244.838264993" Mar 18 15:56:42 crc kubenswrapper[4696]: I0318 15:56:42.806562 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" event={"ID":"9514b639-882b-4ae4-ad29-4e5a24c80e66","Type":"ContainerStarted","Data":"ce2e135445a0486a4a8f8010f551747d59594dccc07c2f88d118bbdd6f9ac5a4"} Mar 18 15:56:42 crc kubenswrapper[4696]: I0318 15:56:42.807168 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:42 crc kubenswrapper[4696]: I0318 15:56:42.837145 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" podStartSLOduration=4.83711702 podStartE2EDuration="4.83711702s" podCreationTimestamp="2026-03-18 15:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:42.827588191 +0000 UTC m=+1245.833762407" watchObservedRunningTime="2026-03-18 15:56:42.83711702 +0000 UTC m=+1245.843291236" Mar 18 15:56:44 crc kubenswrapper[4696]: I0318 15:56:44.839510 4696 generic.go:334] "Generic (PLEG): container finished" podID="c6c5159e-3018-4c1b-8f1c-b40e157d043b" containerID="baa59b49b2d04c27eac13aa0671f98810ecefa397af8a9a414f1ca6baa5a8367" exitCode=0 Mar 18 15:56:44 crc kubenswrapper[4696]: I0318 15:56:44.839825 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9jmt6" event={"ID":"c6c5159e-3018-4c1b-8f1c-b40e157d043b","Type":"ContainerDied","Data":"baa59b49b2d04c27eac13aa0671f98810ecefa397af8a9a414f1ca6baa5a8367"} Mar 18 15:56:44 crc kubenswrapper[4696]: I0318 15:56:44.845792 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sfb9t" event={"ID":"6cfc5851-8295-4c4d-8cb4-4c18f9827227","Type":"ContainerStarted","Data":"b99d4f81a0eb301f1bf9d22f4abef1b7e592e7d87e5c4dd546f2037b2f8a9321"} Mar 18 15:56:44 crc kubenswrapper[4696]: I0318 15:56:44.889560 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-sfb9t" 
podStartSLOduration=2.56204991 podStartE2EDuration="34.889541446s" podCreationTimestamp="2026-03-18 15:56:10 +0000 UTC" firstStartedPulling="2026-03-18 15:56:11.104853467 +0000 UTC m=+1214.111027673" lastFinishedPulling="2026-03-18 15:56:43.432345003 +0000 UTC m=+1246.438519209" observedRunningTime="2026-03-18 15:56:44.884254804 +0000 UTC m=+1247.890429010" watchObservedRunningTime="2026-03-18 15:56:44.889541446 +0000 UTC m=+1247.895715652" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.193500 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.352838 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-combined-ca-bundle\") pod \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.353023 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csv29\" (UniqueName: \"kubernetes.io/projected/c6c5159e-3018-4c1b-8f1c-b40e157d043b-kube-api-access-csv29\") pod \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.353123 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-config-data\") pod \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\" (UID: \"c6c5159e-3018-4c1b-8f1c-b40e157d043b\") " Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.361948 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c5159e-3018-4c1b-8f1c-b40e157d043b-kube-api-access-csv29" (OuterVolumeSpecName: "kube-api-access-csv29") pod 
"c6c5159e-3018-4c1b-8f1c-b40e157d043b" (UID: "c6c5159e-3018-4c1b-8f1c-b40e157d043b"). InnerVolumeSpecName "kube-api-access-csv29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.394175 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6c5159e-3018-4c1b-8f1c-b40e157d043b" (UID: "c6c5159e-3018-4c1b-8f1c-b40e157d043b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.396694 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-config-data" (OuterVolumeSpecName: "config-data") pod "c6c5159e-3018-4c1b-8f1c-b40e157d043b" (UID: "c6c5159e-3018-4c1b-8f1c-b40e157d043b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.455618 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.455853 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c5159e-3018-4c1b-8f1c-b40e157d043b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.455951 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csv29\" (UniqueName: \"kubernetes.io/projected/c6c5159e-3018-4c1b-8f1c-b40e157d043b-kube-api-access-csv29\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.864451 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9jmt6" event={"ID":"c6c5159e-3018-4c1b-8f1c-b40e157d043b","Type":"ContainerDied","Data":"1eefba6279e8ad10c7770a7fc53b0ea9b7ee543e6116641335d1f2e0bd2733c0"} Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.864920 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eefba6279e8ad10c7770a7fc53b0ea9b7ee543e6116641335d1f2e0bd2733c0" Mar 18 15:56:46 crc kubenswrapper[4696]: I0318 15:56:46.864507 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9jmt6" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156108 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5zps8"] Mar 18 15:56:47 crc kubenswrapper[4696]: E0318 15:56:47.156576 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370a3d71-3446-40b6-80ed-7efdb61a36a5" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156597 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="370a3d71-3446-40b6-80ed-7efdb61a36a5" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: E0318 15:56:47.156624 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c" containerName="mariadb-database-create" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156633 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c" containerName="mariadb-database-create" Mar 18 15:56:47 crc kubenswrapper[4696]: E0318 15:56:47.156649 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b3c462-44a9-4899-a39b-463eda7dd5d0" containerName="mariadb-database-create" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156656 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b3c462-44a9-4899-a39b-463eda7dd5d0" containerName="mariadb-database-create" Mar 18 15:56:47 crc kubenswrapper[4696]: E0318 15:56:47.156669 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c5159e-3018-4c1b-8f1c-b40e157d043b" containerName="keystone-db-sync" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156676 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c5159e-3018-4c1b-8f1c-b40e157d043b" containerName="keystone-db-sync" Mar 18 15:56:47 crc kubenswrapper[4696]: E0318 15:56:47.156695 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b834ff85-81d1-4a20-9f59-0790a7492dfc" containerName="mariadb-database-create" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156704 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b834ff85-81d1-4a20-9f59-0790a7492dfc" containerName="mariadb-database-create" Mar 18 15:56:47 crc kubenswrapper[4696]: E0318 15:56:47.156717 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0788afc4-c079-463b-8d56-3a6be70dbf51" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156726 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0788afc4-c079-463b-8d56-3a6be70dbf51" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: E0318 15:56:47.156751 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0c55b7-1488-4ae0-8d27-d063761edde5" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156758 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0c55b7-1488-4ae0-8d27-d063761edde5" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156952 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="370a3d71-3446-40b6-80ed-7efdb61a36a5" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156972 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0c55b7-1488-4ae0-8d27-d063761edde5" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156986 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b3c462-44a9-4899-a39b-463eda7dd5d0" containerName="mariadb-database-create" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.156999 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b834ff85-81d1-4a20-9f59-0790a7492dfc" containerName="mariadb-database-create" Mar 18 
15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.157021 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c" containerName="mariadb-database-create" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.157036 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c5159e-3018-4c1b-8f1c-b40e157d043b" containerName="keystone-db-sync" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.157049 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0788afc4-c079-463b-8d56-3a6be70dbf51" containerName="mariadb-account-create-update" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.157774 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.166362 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.166613 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.166655 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.166880 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pdpjw" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.173176 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nt7xg"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.173333 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.173417 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" 
podUID="9514b639-882b-4ae4-ad29-4e5a24c80e66" containerName="dnsmasq-dns" containerID="cri-o://ce2e135445a0486a4a8f8010f551747d59594dccc07c2f88d118bbdd6f9ac5a4" gracePeriod=10 Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.177916 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.208669 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5zps8"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.235498 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-sj6w5"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.244441 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.266779 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-sj6w5"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.276907 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/9eaf31d0-9c9f-437d-86b8-c2372266a25e-kube-api-access-dmqhh\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.276961 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-credential-keys\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.277007 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-fernet-keys\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.277063 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-config-data\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.277092 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-scripts\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.277121 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-combined-ca-bundle\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378477 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/9eaf31d0-9c9f-437d-86b8-c2372266a25e-kube-api-access-dmqhh\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378540 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-credential-keys\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378591 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-fernet-keys\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378617 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-config\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378650 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n899l\" (UniqueName: \"kubernetes.io/projected/ff9ac060-ff36-40a0-802d-80559e70a6ae-kube-api-access-n899l\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378700 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-config-data\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378729 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-scripts\") 
pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378746 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378771 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378794 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378811 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-svc\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.378833 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-combined-ca-bundle\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.390662 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-config-data\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.390798 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-v2tzn"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.393094 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.393704 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-combined-ca-bundle\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.400427 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c77bcccc7-9tt7j"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.409726 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.415025 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.415243 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.415356 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rp4hm" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.416015 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-fernet-keys\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.425164 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.425431 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.425592 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.426407 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gvbkx" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.426685 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-credential-keys\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: 
I0318 15:56:47.445909 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-scripts\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.455462 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-v2tzn"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.461364 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/9eaf31d0-9c9f-437d-86b8-c2372266a25e-kube-api-access-dmqhh\") pod \"keystone-bootstrap-5zps8\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.461449 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c77bcccc7-9tt7j"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.491768 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.493683 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-config\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.493776 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n899l\" (UniqueName: \"kubernetes.io/projected/ff9ac060-ff36-40a0-802d-80559e70a6ae-kube-api-access-n899l\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.493830 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.493865 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.493894 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " 
pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.493921 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-svc\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.495053 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-svc\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.509740 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.512581 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.518896 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-config\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.521555 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.566772 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n899l\" (UniqueName: \"kubernetes.io/projected/ff9ac060-ff36-40a0-802d-80559e70a6ae-kube-api-access-n899l\") pod \"dnsmasq-dns-5b868669f-sj6w5\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") " pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.577856 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-sj6w5" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.597933 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-combined-ca-bundle\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598002 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6531566d-5085-46df-8412-2285f9f04c19-logs\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598027 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v4g7\" (UniqueName: \"kubernetes.io/projected/6531566d-5085-46df-8412-2285f9f04c19-kube-api-access-5v4g7\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: 
\"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598068 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a76866d-35bf-4dee-8fc4-a5c018e9edce-etc-machine-id\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598161 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwjw\" (UniqueName: \"kubernetes.io/projected/4a76866d-35bf-4dee-8fc4-a5c018e9edce-kube-api-access-5kwjw\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598221 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6531566d-5085-46df-8412-2285f9f04c19-horizon-secret-key\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598245 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-scripts\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598285 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-db-sync-config-data\") pod \"cinder-db-sync-v2tzn\" (UID: 
\"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598323 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-config-data\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598351 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-scripts\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.598372 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-config-data\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700344 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zfkzp"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700742 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-scripts\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700808 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-db-sync-config-data\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700848 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-config-data\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700872 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-scripts\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700889 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-config-data\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700913 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-combined-ca-bundle\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700939 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6531566d-5085-46df-8412-2285f9f04c19-logs\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " 
pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700968 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v4g7\" (UniqueName: \"kubernetes.io/projected/6531566d-5085-46df-8412-2285f9f04c19-kube-api-access-5v4g7\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.700986 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a76866d-35bf-4dee-8fc4-a5c018e9edce-etc-machine-id\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.701033 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwjw\" (UniqueName: \"kubernetes.io/projected/4a76866d-35bf-4dee-8fc4-a5c018e9edce-kube-api-access-5kwjw\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.701080 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6531566d-5085-46df-8412-2285f9f04c19-horizon-secret-key\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.701493 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.707409 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-scripts\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.708004 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6531566d-5085-46df-8412-2285f9f04c19-logs\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.708607 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a76866d-35bf-4dee-8fc4-a5c018e9edce-etc-machine-id\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.709469 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-config-data\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.716047 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6531566d-5085-46df-8412-2285f9f04c19-horizon-secret-key\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.716297 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-scripts\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.737453 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.737589 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.738098 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7h2qr" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.744354 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-db-sync-config-data\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.746719 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v4g7\" (UniqueName: \"kubernetes.io/projected/6531566d-5085-46df-8412-2285f9f04c19-kube-api-access-5v4g7\") pod \"horizon-6c77bcccc7-9tt7j\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") " pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.755859 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-config-data\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.755947 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-zfkzp"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.761654 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-combined-ca-bundle\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.772111 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.779345 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwjw\" (UniqueName: \"kubernetes.io/projected/4a76866d-35bf-4dee-8fc4-a5c018e9edce-kube-api-access-5kwjw\") pod \"cinder-db-sync-v2tzn\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.783210 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.790828 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.803174 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-logs\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.803282 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78d2r\" (UniqueName: \"kubernetes.io/projected/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-kube-api-access-78d2r\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.803333 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-config-data\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.813594 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.813890 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-combined-ca-bundle\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc 
kubenswrapper[4696]: I0318 15:56:47.814016 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-scripts\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.827252 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.844008 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.892469 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c77bcccc7-9tt7j" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.925218 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78d2r\" (UniqueName: \"kubernetes.io/projected/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-kube-api-access-78d2r\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.925661 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.925770 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-run-httpd\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " 
pod="openstack/ceilometer-0" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.925867 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-config-data\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.926009 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-scripts\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.926119 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-combined-ca-bundle\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.926212 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjzpm\" (UniqueName: \"kubernetes.io/projected/6e295c1a-9787-42a3-ac9c-5252bda652b5-kube-api-access-xjzpm\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.926344 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-scripts\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.926436 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.926551 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-config-data\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.926652 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-logs\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.926747 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-log-httpd\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.928278 4696 generic.go:334] "Generic (PLEG): container finished" podID="9514b639-882b-4ae4-ad29-4e5a24c80e66" containerID="ce2e135445a0486a4a8f8010f551747d59594dccc07c2f88d118bbdd6f9ac5a4" exitCode=0 Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.928461 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9dbdd94d9-hg2vz"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.930176 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" 
event={"ID":"9514b639-882b-4ae4-ad29-4e5a24c80e66","Type":"ContainerDied","Data":"ce2e135445a0486a4a8f8010f551747d59594dccc07c2f88d118bbdd6f9ac5a4"} Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.932748 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.934184 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-logs\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.937595 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-scripts\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.945243 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-config-data\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.966002 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9dbdd94d9-hg2vz"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.974420 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78d2r\" (UniqueName: \"kubernetes.io/projected/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-kube-api-access-78d2r\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 
15:56:47.974555 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-combined-ca-bundle\") pod \"placement-db-sync-zfkzp\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.976242 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-h94vl"] Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.977990 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.981218 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.981461 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.983948 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pjn66" Mar 18 15:56:47 crc kubenswrapper[4696]: I0318 15:56:47.992466 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-sj6w5"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.024927 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-h94vl"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.029188 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-config-data\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.029377 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbddk\" (UniqueName: \"kubernetes.io/projected/fd1aba9c-5505-4cee-a201-bc88e5c75f92-kube-api-access-dbddk\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.030386 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-scripts\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.030925 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-combined-ca-bundle\") pod \"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.031749 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjzpm\" (UniqueName: \"kubernetes.io/projected/6e295c1a-9787-42a3-ac9c-5252bda652b5-kube-api-access-xjzpm\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.031892 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd1aba9c-5505-4cee-a201-bc88e5c75f92-logs\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.031946 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.032693 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-config-data\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.032790 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-log-httpd\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.032841 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdlmq\" (UniqueName: \"kubernetes.io/projected/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-kube-api-access-pdlmq\") pod \"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.032986 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-scripts\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.033020 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " 
pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.033057 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-run-httpd\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.033123 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd1aba9c-5505-4cee-a201-bc88e5c75f92-horizon-secret-key\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.033156 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-config\") pod \"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.034117 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-scripts\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.034442 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-log-httpd\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.034760 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-run-httpd\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.038510 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.039401 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-config-data\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.044656 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fxg9t"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.046498 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.046647 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.050601 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.050908 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-27j6v" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.053605 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjzpm\" (UniqueName: \"kubernetes.io/projected/6e295c1a-9787-42a3-ac9c-5252bda652b5-kube-api-access-xjzpm\") pod \"ceilometer-0\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.055265 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fxg9t"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.067688 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-ntvq9"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.069861 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.092108 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zfkzp" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.094771 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-ntvq9"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136366 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd1aba9c-5505-4cee-a201-bc88e5c75f92-horizon-secret-key\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136414 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-config\") pod \"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136467 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-config-data\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136498 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbddk\" (UniqueName: \"kubernetes.io/projected/fd1aba9c-5505-4cee-a201-bc88e5c75f92-kube-api-access-dbddk\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136548 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pst8\" (UniqueName: 
\"kubernetes.io/projected/81e814fe-6cea-48ae-88e9-00f367333f36-kube-api-access-8pst8\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136590 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-combined-ca-bundle\") pod \"barbican-db-sync-fxg9t\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136645 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136678 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-combined-ca-bundle\") pod \"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136753 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-db-sync-config-data\") pod \"barbican-db-sync-fxg9t\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136778 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136804 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136843 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd1aba9c-5505-4cee-a201-bc88e5c75f92-logs\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.136928 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-svc\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.137017 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdlmq\" (UniqueName: \"kubernetes.io/projected/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-kube-api-access-pdlmq\") pod \"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.137044 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-config\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.137117 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-scripts\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.137148 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4nrh\" (UniqueName: \"kubernetes.io/projected/527c444b-3209-4c1e-addb-ed9404ab8efd-kube-api-access-n4nrh\") pod \"barbican-db-sync-fxg9t\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.140206 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd1aba9c-5505-4cee-a201-bc88e5c75f92-logs\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.142263 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-config-data\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.142592 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd1aba9c-5505-4cee-a201-bc88e5c75f92-horizon-secret-key\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: 
\"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.143099 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-scripts\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.144193 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-config\") pod \"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.153457 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.165988 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbddk\" (UniqueName: \"kubernetes.io/projected/fd1aba9c-5505-4cee-a201-bc88e5c75f92-kube-api-access-dbddk\") pod \"horizon-9dbdd94d9-hg2vz\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") " pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.166606 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-combined-ca-bundle\") pod \"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.170087 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdlmq\" (UniqueName: \"kubernetes.io/projected/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-kube-api-access-pdlmq\") pod 
\"neutron-db-sync-h94vl\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239547 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4nrh\" (UniqueName: \"kubernetes.io/projected/527c444b-3209-4c1e-addb-ed9404ab8efd-kube-api-access-n4nrh\") pod \"barbican-db-sync-fxg9t\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239636 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pst8\" (UniqueName: \"kubernetes.io/projected/81e814fe-6cea-48ae-88e9-00f367333f36-kube-api-access-8pst8\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239668 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-combined-ca-bundle\") pod \"barbican-db-sync-fxg9t\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239688 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239732 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-db-sync-config-data\") pod \"barbican-db-sync-fxg9t\" (UID: 
\"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239747 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239764 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239807 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-svc\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.239847 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-config\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.241337 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-config\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc 
kubenswrapper[4696]: I0318 15:56:48.241621 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.242200 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-svc\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.242257 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.242800 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.260156 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-db-sync-config-data\") pod \"barbican-db-sync-fxg9t\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.260392 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-combined-ca-bundle\") pod \"barbican-db-sync-fxg9t\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.267150 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pst8\" (UniqueName: \"kubernetes.io/projected/81e814fe-6cea-48ae-88e9-00f367333f36-kube-api-access-8pst8\") pod \"dnsmasq-dns-cf78879c9-ntvq9\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.283535 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9dbdd94d9-hg2vz" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.284199 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4nrh\" (UniqueName: \"kubernetes.io/projected/527c444b-3209-4c1e-addb-ed9404ab8efd-kube-api-access-n4nrh\") pod \"barbican-db-sync-fxg9t\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.299454 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h94vl" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.313007 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5zps8"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.379352 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.407068 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.615968 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.739531 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-sj6w5"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.766104 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-sb\") pod \"9514b639-882b-4ae4-ad29-4e5a24c80e66\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.766147 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-config\") pod \"9514b639-882b-4ae4-ad29-4e5a24c80e66\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.766220 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-svc\") pod \"9514b639-882b-4ae4-ad29-4e5a24c80e66\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.766323 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8t2j\" (UniqueName: \"kubernetes.io/projected/9514b639-882b-4ae4-ad29-4e5a24c80e66-kube-api-access-h8t2j\") pod \"9514b639-882b-4ae4-ad29-4e5a24c80e66\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.766339 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-swift-storage-0\") pod \"9514b639-882b-4ae4-ad29-4e5a24c80e66\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.766440 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-nb\") pod \"9514b639-882b-4ae4-ad29-4e5a24c80e66\" (UID: \"9514b639-882b-4ae4-ad29-4e5a24c80e66\") " Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.796196 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-v2tzn"] Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.834981 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9514b639-882b-4ae4-ad29-4e5a24c80e66-kube-api-access-h8t2j" (OuterVolumeSpecName: "kube-api-access-h8t2j") pod "9514b639-882b-4ae4-ad29-4e5a24c80e66" (UID: "9514b639-882b-4ae4-ad29-4e5a24c80e66"). InnerVolumeSpecName "kube-api-access-h8t2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.872742 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8t2j\" (UniqueName: \"kubernetes.io/projected/9514b639-882b-4ae4-ad29-4e5a24c80e66-kube-api-access-h8t2j\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:48 crc kubenswrapper[4696]: W0318 15:56:48.920961 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a76866d_35bf_4dee_8fc4_a5c018e9edce.slice/crio-bbf6acfd3cbedfeb767c101bda2de64cf09e1d4337d0fb84790ef7b805617731 WatchSource:0}: Error finding container bbf6acfd3cbedfeb767c101bda2de64cf09e1d4337d0fb84790ef7b805617731: Status 404 returned error can't find the container with id bbf6acfd3cbedfeb767c101bda2de64cf09e1d4337d0fb84790ef7b805617731 Mar 18 15:56:48 crc kubenswrapper[4696]: I0318 15:56:48.988628 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zfkzp"] Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.001603 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9514b639-882b-4ae4-ad29-4e5a24c80e66" (UID: "9514b639-882b-4ae4-ad29-4e5a24c80e66"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.018918 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9514b639-882b-4ae4-ad29-4e5a24c80e66" (UID: "9514b639-882b-4ae4-ad29-4e5a24c80e66"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.028179 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c77bcccc7-9tt7j"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.034340 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5zps8" event={"ID":"9eaf31d0-9c9f-437d-86b8-c2372266a25e","Type":"ContainerStarted","Data":"acc827e0601d3fb5fbc803c5b8429c51f060b2219f68366a941b73b64f8fc935"}
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.037473 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v2tzn" event={"ID":"4a76866d-35bf-4dee-8fc4-a5c018e9edce","Type":"ContainerStarted","Data":"bbf6acfd3cbedfeb767c101bda2de64cf09e1d4337d0fb84790ef7b805617731"}
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.045216 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9514b639-882b-4ae4-ad29-4e5a24c80e66" (UID: "9514b639-882b-4ae4-ad29-4e5a24c80e66"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.050229 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.087510 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.087925 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nt7xg" event={"ID":"9514b639-882b-4ae4-ad29-4e5a24c80e66","Type":"ContainerDied","Data":"325671e71d195ebf604692062c068c055258f41987f2b128430ee6e3dfe37564"}
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.088184 4696 scope.go:117] "RemoveContainer" containerID="ce2e135445a0486a4a8f8010f551747d59594dccc07c2f88d118bbdd6f9ac5a4"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.092542 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-sj6w5" event={"ID":"ff9ac060-ff36-40a0-802d-80559e70a6ae","Type":"ContainerStarted","Data":"6f759cdaaed7a7534d26ff5c71793631125fe322bd6dd04da0959a48a15df8d6"}
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.093112 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9514b639-882b-4ae4-ad29-4e5a24c80e66" (UID: "9514b639-882b-4ae4-ad29-4e5a24c80e66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.094534 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.094566 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.094577 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.094588 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.119757 4696 scope.go:117] "RemoveContainer" containerID="3317a11522ef86cbef57116cf30883aefc4e183bf21174e92c3253650c7a7558"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.134027 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-config" (OuterVolumeSpecName: "config") pod "9514b639-882b-4ae4-ad29-4e5a24c80e66" (UID: "9514b639-882b-4ae4-ad29-4e5a24c80e66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.196151 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9514b639-882b-4ae4-ad29-4e5a24c80e66-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.299332 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-h94vl"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.497132 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c77bcccc7-9tt7j"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.549363 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b785567b9-m76ht"]
Mar 18 15:56:49 crc kubenswrapper[4696]: E0318 15:56:49.550029 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9514b639-882b-4ae4-ad29-4e5a24c80e66" containerName="init"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.550052 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9514b639-882b-4ae4-ad29-4e5a24c80e66" containerName="init"
Mar 18 15:56:49 crc kubenswrapper[4696]: E0318 15:56:49.550074 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9514b639-882b-4ae4-ad29-4e5a24c80e66" containerName="dnsmasq-dns"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.550082 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9514b639-882b-4ae4-ad29-4e5a24c80e66" containerName="dnsmasq-dns"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.550320 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9514b639-882b-4ae4-ad29-4e5a24c80e66" containerName="dnsmasq-dns"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.551430 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: W0318 15:56:49.570972 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e814fe_6cea_48ae_88e9_00f367333f36.slice/crio-fd21d4e4fc96e5d650394f785baa2d711150c683f7fa6d2153912f34597b27f2 WatchSource:0}: Error finding container fd21d4e4fc96e5d650394f785baa2d711150c683f7fa6d2153912f34597b27f2: Status 404 returned error can't find the container with id fd21d4e4fc96e5d650394f785baa2d711150c683f7fa6d2153912f34597b27f2
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.588472 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-ntvq9"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.619133 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9dbdd94d9-hg2vz"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.627551 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b785567b9-m76ht"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.635600 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nt7xg"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.642364 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nt7xg"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.643762 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6drkq\" (UniqueName: \"kubernetes.io/projected/80f279e9-8ebe-4ce6-b59f-727f3fd97678-kube-api-access-6drkq\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.643835 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f279e9-8ebe-4ce6-b59f-727f3fd97678-logs\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.643904 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-scripts\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.643954 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-config-data\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.643995 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80f279e9-8ebe-4ce6-b59f-727f3fd97678-horizon-secret-key\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.688451 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.709793 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fxg9t"]
Mar 18 15:56:49 crc kubenswrapper[4696]: W0318 15:56:49.720004 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod527c444b_3209_4c1e_addb_ed9404ab8efd.slice/crio-e48fe629cc48746936fd4db776af58977aacfe39047a902dc8f828f2f21fc378 WatchSource:0}: Error finding container e48fe629cc48746936fd4db776af58977aacfe39047a902dc8f828f2f21fc378: Status 404 returned error can't find the container with id e48fe629cc48746936fd4db776af58977aacfe39047a902dc8f828f2f21fc378
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.746150 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80f279e9-8ebe-4ce6-b59f-727f3fd97678-horizon-secret-key\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.746258 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6drkq\" (UniqueName: \"kubernetes.io/projected/80f279e9-8ebe-4ce6-b59f-727f3fd97678-kube-api-access-6drkq\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.746331 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f279e9-8ebe-4ce6-b59f-727f3fd97678-logs\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.746381 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-scripts\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.746415 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-config-data\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.748556 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-config-data\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.749804 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f279e9-8ebe-4ce6-b59f-727f3fd97678-logs\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.750228 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-scripts\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.758254 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80f279e9-8ebe-4ce6-b59f-727f3fd97678-horizon-secret-key\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.784506 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6drkq\" (UniqueName: \"kubernetes.io/projected/80f279e9-8ebe-4ce6-b59f-727f3fd97678-kube-api-access-6drkq\") pod \"horizon-7b785567b9-m76ht\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") " pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:49 crc kubenswrapper[4696]: I0318 15:56:49.905833 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.111261 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5zps8" event={"ID":"9eaf31d0-9c9f-437d-86b8-c2372266a25e","Type":"ContainerStarted","Data":"33bbac5754dd862078d8a24c3ef40c131595f1ee86df49baefb6cec43608b237"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.114727 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fxg9t" event={"ID":"527c444b-3209-4c1e-addb-ed9404ab8efd","Type":"ContainerStarted","Data":"e48fe629cc48746936fd4db776af58977aacfe39047a902dc8f828f2f21fc378"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.119663 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e295c1a-9787-42a3-ac9c-5252bda652b5","Type":"ContainerStarted","Data":"5a19e230e0245bb36d4544140a0c3346de38f6d54e6eff72ba8448086f555bb7"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.123398 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zfkzp" event={"ID":"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617","Type":"ContainerStarted","Data":"970b3e113be70b3aafb79b6792088adcbcfc00e95855cff5f0dfa099beacc011"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.130254 4696 generic.go:334] "Generic (PLEG): container finished" podID="81e814fe-6cea-48ae-88e9-00f367333f36" containerID="442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366" exitCode=0
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.130464 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" event={"ID":"81e814fe-6cea-48ae-88e9-00f367333f36","Type":"ContainerDied","Data":"442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.130676 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" event={"ID":"81e814fe-6cea-48ae-88e9-00f367333f36","Type":"ContainerStarted","Data":"fd21d4e4fc96e5d650394f785baa2d711150c683f7fa6d2153912f34597b27f2"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.133946 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5zps8" podStartSLOduration=3.133914004 podStartE2EDuration="3.133914004s" podCreationTimestamp="2026-03-18 15:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:50.127480703 +0000 UTC m=+1253.133654909" watchObservedRunningTime="2026-03-18 15:56:50.133914004 +0000 UTC m=+1253.140088210"
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.143863 4696 generic.go:334] "Generic (PLEG): container finished" podID="ff9ac060-ff36-40a0-802d-80559e70a6ae" containerID="8467c7bee7460795057d5e6cb6edc5fa85e386705cf5a88e5f5e120e983a1e3f" exitCode=0
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.144012 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-sj6w5" event={"ID":"ff9ac060-ff36-40a0-802d-80559e70a6ae","Type":"ContainerDied","Data":"8467c7bee7460795057d5e6cb6edc5fa85e386705cf5a88e5f5e120e983a1e3f"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.146406 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h94vl" event={"ID":"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6","Type":"ContainerStarted","Data":"fc978c83701e43ec969e0cf94165ae2aacd2fcac4bf6931bdf030d1808b2cd67"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.146780 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h94vl" event={"ID":"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6","Type":"ContainerStarted","Data":"ec0d5bc4f1f681a5ec5198a49bb43a83d91aef0b58c7533ca216c1003ad21e2d"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.160914 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c77bcccc7-9tt7j" event={"ID":"6531566d-5085-46df-8412-2285f9f04c19","Type":"ContainerStarted","Data":"32c5566ed08c4730c6bf46ce48a38660bc68b1084925e192cb61a20619a4784d"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.176107 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9dbdd94d9-hg2vz" event={"ID":"fd1aba9c-5505-4cee-a201-bc88e5c75f92","Type":"ContainerStarted","Data":"f71abd300c76fc25d08b6bda1e227e7472ac14d6d6b48c86ce1bbf952ea0a6cb"}
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.187851 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-h94vl" podStartSLOduration=3.187799034 podStartE2EDuration="3.187799034s" podCreationTimestamp="2026-03-18 15:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:50.182018099 +0000 UTC m=+1253.188192315" watchObservedRunningTime="2026-03-18 15:56:50.187799034 +0000 UTC m=+1253.193973240"
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.516933 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b785567b9-m76ht"]
Mar 18 15:56:50 crc kubenswrapper[4696]: W0318 15:56:50.595326 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80f279e9_8ebe_4ce6_b59f_727f3fd97678.slice/crio-59809f91b1c92d58ced1cee68b8f28e6cc61947bf4fcead29c2dfc99bad0ef9b WatchSource:0}: Error finding container 59809f91b1c92d58ced1cee68b8f28e6cc61947bf4fcead29c2dfc99bad0ef9b: Status 404 returned error can't find the container with id 59809f91b1c92d58ced1cee68b8f28e6cc61947bf4fcead29c2dfc99bad0ef9b
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.672925 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-sj6w5"
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.807024 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-config\") pod \"ff9ac060-ff36-40a0-802d-80559e70a6ae\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") "
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.807091 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-svc\") pod \"ff9ac060-ff36-40a0-802d-80559e70a6ae\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") "
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.807198 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-nb\") pod \"ff9ac060-ff36-40a0-802d-80559e70a6ae\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") "
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.807280 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-swift-storage-0\") pod \"ff9ac060-ff36-40a0-802d-80559e70a6ae\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") "
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.808963 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-sb\") pod \"ff9ac060-ff36-40a0-802d-80559e70a6ae\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") "
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.809134 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n899l\" (UniqueName: \"kubernetes.io/projected/ff9ac060-ff36-40a0-802d-80559e70a6ae-kube-api-access-n899l\") pod \"ff9ac060-ff36-40a0-802d-80559e70a6ae\" (UID: \"ff9ac060-ff36-40a0-802d-80559e70a6ae\") "
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.816326 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9ac060-ff36-40a0-802d-80559e70a6ae-kube-api-access-n899l" (OuterVolumeSpecName: "kube-api-access-n899l") pod "ff9ac060-ff36-40a0-802d-80559e70a6ae" (UID: "ff9ac060-ff36-40a0-802d-80559e70a6ae"). InnerVolumeSpecName "kube-api-access-n899l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.835622 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff9ac060-ff36-40a0-802d-80559e70a6ae" (UID: "ff9ac060-ff36-40a0-802d-80559e70a6ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.852767 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff9ac060-ff36-40a0-802d-80559e70a6ae" (UID: "ff9ac060-ff36-40a0-802d-80559e70a6ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.855718 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-config" (OuterVolumeSpecName: "config") pod "ff9ac060-ff36-40a0-802d-80559e70a6ae" (UID: "ff9ac060-ff36-40a0-802d-80559e70a6ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.875604 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff9ac060-ff36-40a0-802d-80559e70a6ae" (UID: "ff9ac060-ff36-40a0-802d-80559e70a6ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.880280 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff9ac060-ff36-40a0-802d-80559e70a6ae" (UID: "ff9ac060-ff36-40a0-802d-80559e70a6ae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.913415 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.913479 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n899l\" (UniqueName: \"kubernetes.io/projected/ff9ac060-ff36-40a0-802d-80559e70a6ae-kube-api-access-n899l\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.913496 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.913505 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.913532 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:50 crc kubenswrapper[4696]: I0318 15:56:50.913543 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9ac060-ff36-40a0-802d-80559e70a6ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.227687 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" event={"ID":"81e814fe-6cea-48ae-88e9-00f367333f36","Type":"ContainerStarted","Data":"a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f"}
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.229168 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9"
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.241636 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-sj6w5" event={"ID":"ff9ac060-ff36-40a0-802d-80559e70a6ae","Type":"ContainerDied","Data":"6f759cdaaed7a7534d26ff5c71793631125fe322bd6dd04da0959a48a15df8d6"}
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.241682 4696 scope.go:117] "RemoveContainer" containerID="8467c7bee7460795057d5e6cb6edc5fa85e386705cf5a88e5f5e120e983a1e3f"
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.241869 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-sj6w5"
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.252662 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b785567b9-m76ht" event={"ID":"80f279e9-8ebe-4ce6-b59f-727f3fd97678","Type":"ContainerStarted","Data":"59809f91b1c92d58ced1cee68b8f28e6cc61947bf4fcead29c2dfc99bad0ef9b"}
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.259631 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" podStartSLOduration=4.25961104 podStartE2EDuration="4.25961104s" podCreationTimestamp="2026-03-18 15:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:51.249294461 +0000 UTC m=+1254.255468687" watchObservedRunningTime="2026-03-18 15:56:51.25961104 +0000 UTC m=+1254.265785246"
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.362876 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-sj6w5"]
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.375645 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-sj6w5"]
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.618103 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9514b639-882b-4ae4-ad29-4e5a24c80e66" path="/var/lib/kubelet/pods/9514b639-882b-4ae4-ad29-4e5a24c80e66/volumes"
Mar 18 15:56:51 crc kubenswrapper[4696]: I0318 15:56:51.621162 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9ac060-ff36-40a0-802d-80559e70a6ae" path="/var/lib/kubelet/pods/ff9ac060-ff36-40a0-802d-80559e70a6ae/volumes"
Mar 18 15:56:55 crc kubenswrapper[4696]: I0318 15:56:55.308400 4696 generic.go:334] "Generic (PLEG): container finished" podID="9eaf31d0-9c9f-437d-86b8-c2372266a25e" containerID="33bbac5754dd862078d8a24c3ef40c131595f1ee86df49baefb6cec43608b237" exitCode=0
Mar 18 15:56:55 crc kubenswrapper[4696]: I0318 15:56:55.308859 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5zps8" event={"ID":"9eaf31d0-9c9f-437d-86b8-c2372266a25e","Type":"ContainerDied","Data":"33bbac5754dd862078d8a24c3ef40c131595f1ee86df49baefb6cec43608b237"}
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.058420 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9dbdd94d9-hg2vz"]
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.092952 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-696476876d-4rxz2"]
Mar 18 15:56:56 crc kubenswrapper[4696]: E0318 15:56:56.093474 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9ac060-ff36-40a0-802d-80559e70a6ae" containerName="init"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.093504 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9ac060-ff36-40a0-802d-80559e70a6ae" containerName="init"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.093727 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9ac060-ff36-40a0-802d-80559e70a6ae" containerName="init"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.095184 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.099841 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.128622 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-696476876d-4rxz2"]
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.169144 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b785567b9-m76ht"]
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.199373 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59764c649b-dxxpb"]
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.201013 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59764c649b-dxxpb"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.214633 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59764c649b-dxxpb"]
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.241846 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72021b21-00cf-4c33-be2d-b24f20dc0f9f-logs\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.241927 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-scripts\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.241952 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-combined-ca-bundle\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.241972 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr44m\" (UniqueName: \"kubernetes.io/projected/72021b21-00cf-4c33-be2d-b24f20dc0f9f-kube-api-access-qr44m\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.241990 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-secret-key\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.242012 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-config-data\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.242064 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-tls-certs\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.339037 4696 generic.go:334] "Generic (PLEG): container finished" podID="6cfc5851-8295-4c4d-8cb4-4c18f9827227" containerID="b99d4f81a0eb301f1bf9d22f4abef1b7e592e7d87e5c4dd546f2037b2f8a9321" exitCode=0
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.339115 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sfb9t" event={"ID":"6cfc5851-8295-4c4d-8cb4-4c18f9827227","Type":"ContainerDied","Data":"b99d4f81a0eb301f1bf9d22f4abef1b7e592e7d87e5c4dd546f2037b2f8a9321"}
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348101 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7dqn\" (UniqueName: \"kubernetes.io/projected/abd090d6-037c-4cc7-907a-43293ce636ff-kube-api-access-t7dqn\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348148 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abd090d6-037c-4cc7-907a-43293ce636ff-scripts\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348174 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72021b21-00cf-4c33-be2d-b24f20dc0f9f-logs\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2"
Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348246 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-scripts\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " 
pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348267 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-combined-ca-bundle\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348284 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr44m\" (UniqueName: \"kubernetes.io/projected/72021b21-00cf-4c33-be2d-b24f20dc0f9f-kube-api-access-qr44m\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348300 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-secret-key\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348316 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-config-data\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348340 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd090d6-037c-4cc7-907a-43293ce636ff-logs\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc 
kubenswrapper[4696]: I0318 15:56:56.348370 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-horizon-secret-key\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348408 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-tls-certs\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348423 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abd090d6-037c-4cc7-907a-43293ce636ff-config-data\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348445 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-combined-ca-bundle\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.348463 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-horizon-tls-certs\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc 
kubenswrapper[4696]: I0318 15:56:56.349016 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72021b21-00cf-4c33-be2d-b24f20dc0f9f-logs\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.349582 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-scripts\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.353308 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-config-data\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.357577 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-secret-key\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.358367 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-combined-ca-bundle\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.371644 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-tls-certs\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.381318 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr44m\" (UniqueName: \"kubernetes.io/projected/72021b21-00cf-4c33-be2d-b24f20dc0f9f-kube-api-access-qr44m\") pod \"horizon-696476876d-4rxz2\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.448030 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.449260 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-horizon-secret-key\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.449311 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abd090d6-037c-4cc7-907a-43293ce636ff-config-data\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.449333 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-combined-ca-bundle\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.449349 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-horizon-tls-certs\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.449399 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7dqn\" (UniqueName: \"kubernetes.io/projected/abd090d6-037c-4cc7-907a-43293ce636ff-kube-api-access-t7dqn\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.449414 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abd090d6-037c-4cc7-907a-43293ce636ff-scripts\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.449474 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd090d6-037c-4cc7-907a-43293ce636ff-logs\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.450186 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd090d6-037c-4cc7-907a-43293ce636ff-logs\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.453357 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/abd090d6-037c-4cc7-907a-43293ce636ff-config-data\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.455011 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abd090d6-037c-4cc7-907a-43293ce636ff-scripts\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.461171 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-horizon-tls-certs\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.461368 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-combined-ca-bundle\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.468043 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abd090d6-037c-4cc7-907a-43293ce636ff-horizon-secret-key\") pod \"horizon-59764c649b-dxxpb\" (UID: \"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.517331 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7dqn\" (UniqueName: \"kubernetes.io/projected/abd090d6-037c-4cc7-907a-43293ce636ff-kube-api-access-t7dqn\") pod \"horizon-59764c649b-dxxpb\" (UID: 
\"abd090d6-037c-4cc7-907a-43293ce636ff\") " pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:56 crc kubenswrapper[4696]: I0318 15:56:56.595289 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:56:58 crc kubenswrapper[4696]: I0318 15:56:58.410512 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:56:58 crc kubenswrapper[4696]: I0318 15:56:58.479699 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5wrdf"] Mar 18 15:56:58 crc kubenswrapper[4696]: I0318 15:56:58.480062 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="dnsmasq-dns" containerID="cri-o://a8bd79c572b1a15ef5a9fbd43e71eaa9854e895448bcf5c07abc87817cd7fa22" gracePeriod=10 Mar 18 15:56:59 crc kubenswrapper[4696]: I0318 15:56:59.384323 4696 generic.go:334] "Generic (PLEG): container finished" podID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerID="a8bd79c572b1a15ef5a9fbd43e71eaa9854e895448bcf5c07abc87817cd7fa22" exitCode=0 Mar 18 15:56:59 crc kubenswrapper[4696]: I0318 15:56:59.385061 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" event={"ID":"c4151e21-6506-415e-9dbe-3fe4389838b6","Type":"ContainerDied","Data":"a8bd79c572b1a15ef5a9fbd43e71eaa9854e895448bcf5c07abc87817cd7fa22"} Mar 18 15:57:02 crc kubenswrapper[4696]: I0318 15:57:02.448350 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Mar 18 15:57:03 crc kubenswrapper[4696]: E0318 15:57:03.775304 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 15:57:03 crc kubenswrapper[4696]: E0318 15:57:03.776106 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb7h655h7h64dh98h697hf8h9bh5b8h666h6h597h68fhb5h65fh5f8h557hd4h5c9h5c7h68bhbbh64h555hcch99h56bhdbh7fhcfh697h696q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5v4g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource
{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6c77bcccc7-9tt7j_openstack(6531566d-5085-46df-8412-2285f9f04c19): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:57:03 crc kubenswrapper[4696]: E0318 15:57:03.778770 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6c77bcccc7-9tt7j" podUID="6531566d-5085-46df-8412-2285f9f04c19" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 15:57:05.390750 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 15:57:05.391825 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78d2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-zfkzp_openstack(4f27b4c3-3df4-4f88-9bf1-b0f4c242d617): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 15:57:05.393417 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-zfkzp" podUID="4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 15:57:05.404925 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 15:57:05.405122 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n545hc7h64bh5cdh5ch54dh64ch5b6h655h54h59h55fh674h545h5b8hfhfh55ch55bh659h595h596h576hfchd4hc7hf4h547h57bhd5h54h567q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dbddk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-9dbdd94d9-hg2vz_openstack(fd1aba9c-5505-4cee-a201-bc88e5c75f92): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 
15:57:05.408610 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-9dbdd94d9-hg2vz" podUID="fd1aba9c-5505-4cee-a201-bc88e5c75f92" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 15:57:05.463370 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-zfkzp" podUID="4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 15:57:05.821387 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 18 15:57:05 crc kubenswrapper[4696]: E0318 15:57:05.822083 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6bh659h5ddhfdh54ch679h586h67h547h57ch589h5f4h79h57fh57fh86h7bh667h94h5f9h6dhc6h65hdbh576h668h94hfh9fhd8h5fchbfq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjzpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6e295c1a-9787-42a3-ac9c-5252bda652b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.912348 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.921941 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-sfb9t" Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.969248 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-combined-ca-bundle\") pod \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.969299 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-credential-keys\") pod \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.969348 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-fernet-keys\") pod \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.969402 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-scripts\") pod \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.969606 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/9eaf31d0-9c9f-437d-86b8-c2372266a25e-kube-api-access-dmqhh\") pod \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.969680 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-config-data\") pod \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\" (UID: \"9eaf31d0-9c9f-437d-86b8-c2372266a25e\") " Mar 18 15:57:05 crc kubenswrapper[4696]: I0318 15:57:05.993781 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9eaf31d0-9c9f-437d-86b8-c2372266a25e" (UID: "9eaf31d0-9c9f-437d-86b8-c2372266a25e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.001757 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-scripts" (OuterVolumeSpecName: "scripts") pod "9eaf31d0-9c9f-437d-86b8-c2372266a25e" (UID: "9eaf31d0-9c9f-437d-86b8-c2372266a25e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.001980 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-config-data" (OuterVolumeSpecName: "config-data") pod "9eaf31d0-9c9f-437d-86b8-c2372266a25e" (UID: "9eaf31d0-9c9f-437d-86b8-c2372266a25e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.002972 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9eaf31d0-9c9f-437d-86b8-c2372266a25e" (UID: "9eaf31d0-9c9f-437d-86b8-c2372266a25e"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.003358 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eaf31d0-9c9f-437d-86b8-c2372266a25e-kube-api-access-dmqhh" (OuterVolumeSpecName: "kube-api-access-dmqhh") pod "9eaf31d0-9c9f-437d-86b8-c2372266a25e" (UID: "9eaf31d0-9c9f-437d-86b8-c2372266a25e"). InnerVolumeSpecName "kube-api-access-dmqhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.044752 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9eaf31d0-9c9f-437d-86b8-c2372266a25e" (UID: "9eaf31d0-9c9f-437d-86b8-c2372266a25e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.071738 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sp7m\" (UniqueName: \"kubernetes.io/projected/6cfc5851-8295-4c4d-8cb4-4c18f9827227-kube-api-access-8sp7m\") pod \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.072290 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-db-sync-config-data\") pod \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.072537 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-combined-ca-bundle\") pod 
\"6cfc5851-8295-4c4d-8cb4-4c18f9827227\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.072642 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-config-data\") pod \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\" (UID: \"6cfc5851-8295-4c4d-8cb4-4c18f9827227\") " Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.073198 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/9eaf31d0-9c9f-437d-86b8-c2372266a25e-kube-api-access-dmqhh\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.073302 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.073378 4696 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.073478 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.073582 4696 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.073672 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eaf31d0-9c9f-437d-86b8-c2372266a25e-scripts\") on 
node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.075999 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6cfc5851-8295-4c4d-8cb4-4c18f9827227" (UID: "6cfc5851-8295-4c4d-8cb4-4c18f9827227"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.076669 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfc5851-8295-4c4d-8cb4-4c18f9827227-kube-api-access-8sp7m" (OuterVolumeSpecName: "kube-api-access-8sp7m") pod "6cfc5851-8295-4c4d-8cb4-4c18f9827227" (UID: "6cfc5851-8295-4c4d-8cb4-4c18f9827227"). InnerVolumeSpecName "kube-api-access-8sp7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.099941 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cfc5851-8295-4c4d-8cb4-4c18f9827227" (UID: "6cfc5851-8295-4c4d-8cb4-4c18f9827227"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.124888 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-config-data" (OuterVolumeSpecName: "config-data") pod "6cfc5851-8295-4c4d-8cb4-4c18f9827227" (UID: "6cfc5851-8295-4c4d-8cb4-4c18f9827227"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.176514 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.176648 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.176666 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sp7m\" (UniqueName: \"kubernetes.io/projected/6cfc5851-8295-4c4d-8cb4-4c18f9827227-kube-api-access-8sp7m\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.176684 4696 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6cfc5851-8295-4c4d-8cb4-4c18f9827227-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4696]: E0318 15:57:06.409112 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 18 15:57:06 crc kubenswrapper[4696]: E0318 15:57:06.409370 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n4nrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-fxg9t_openstack(527c444b-3209-4c1e-addb-ed9404ab8efd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:57:06 crc kubenswrapper[4696]: E0318 15:57:06.410595 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-fxg9t" 
podUID="527c444b-3209-4c1e-addb-ed9404ab8efd" Mar 18 15:57:06 crc kubenswrapper[4696]: E0318 15:57:06.426494 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 15:57:06 crc kubenswrapper[4696]: E0318 15:57:06.426867 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56fh549h7fh697h674h5bbh5dfhc8h5cchdh699hb8h8fh97h5bdh56ch6ch4hd9hb8h55bh696h599h59ch68h695h599h59h59h9ch8dh557q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6drkq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:ni
l,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7b785567b9-m76ht_openstack(80f279e9-8ebe-4ce6-b59f-727f3fd97678): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:57:06 crc kubenswrapper[4696]: E0318 15:57:06.429674 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7b785567b9-m76ht" podUID="80f279e9-8ebe-4ce6-b59f-727f3fd97678" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.470878 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5zps8" event={"ID":"9eaf31d0-9c9f-437d-86b8-c2372266a25e","Type":"ContainerDied","Data":"acc827e0601d3fb5fbc803c5b8429c51f060b2219f68366a941b73b64f8fc935"} Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.470929 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc827e0601d3fb5fbc803c5b8429c51f060b2219f68366a941b73b64f8fc935" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.470983 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5zps8" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.475579 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sfb9t" event={"ID":"6cfc5851-8295-4c4d-8cb4-4c18f9827227","Type":"ContainerDied","Data":"a1e76f83baf6200059f7cdf8d141771e8dd3f3758566342c6298d8d5b7f2d3ce"} Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.475734 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1e76f83baf6200059f7cdf8d141771e8dd3f3758566342c6298d8d5b7f2d3ce" Mar 18 15:57:06 crc kubenswrapper[4696]: I0318 15:57:06.475654 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sfb9t" Mar 18 15:57:06 crc kubenswrapper[4696]: E0318 15:57:06.478039 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-fxg9t" podUID="527c444b-3209-4c1e-addb-ed9404ab8efd" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.038922 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5zps8"] Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.049031 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5zps8"] Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.115190 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jvqtp"] Mar 18 15:57:07 crc kubenswrapper[4696]: E0318 15:57:07.115850 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfc5851-8295-4c4d-8cb4-4c18f9827227" containerName="glance-db-sync" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.115873 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6cfc5851-8295-4c4d-8cb4-4c18f9827227" containerName="glance-db-sync" Mar 18 15:57:07 crc kubenswrapper[4696]: E0318 15:57:07.115888 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eaf31d0-9c9f-437d-86b8-c2372266a25e" containerName="keystone-bootstrap" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.115895 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eaf31d0-9c9f-437d-86b8-c2372266a25e" containerName="keystone-bootstrap" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.116096 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfc5851-8295-4c4d-8cb4-4c18f9827227" containerName="glance-db-sync" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.116118 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eaf31d0-9c9f-437d-86b8-c2372266a25e" containerName="keystone-bootstrap" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.116779 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.118713 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.119071 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.119291 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pdpjw" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.119462 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.119631 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.132125 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jvqtp"] Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.198976 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-scripts\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.199024 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-credential-keys\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.199049 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-config-data\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.199107 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-combined-ca-bundle\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.199166 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-fernet-keys\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.199193 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckz6z\" (UniqueName: \"kubernetes.io/projected/319d79c7-8160-4b57-9b13-3797a015cbdf-kube-api-access-ckz6z\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.300321 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-combined-ca-bundle\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.300423 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-fernet-keys\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.300490 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckz6z\" (UniqueName: \"kubernetes.io/projected/319d79c7-8160-4b57-9b13-3797a015cbdf-kube-api-access-ckz6z\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.300932 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-scripts\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.300966 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-credential-keys\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.301469 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-config-data\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.303682 4696 scope.go:117] "RemoveContainer" containerID="67687a9e1a40fad3c084ffa78922a2daf272549d0af7985cbea14e5473be0f9e" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.310272 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-scripts\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.318356 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-combined-ca-bundle\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.318596 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-fernet-keys\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.318697 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-config-data\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.318911 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-credential-keys\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.331086 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckz6z\" (UniqueName: 
\"kubernetes.io/projected/319d79c7-8160-4b57-9b13-3797a015cbdf-kube-api-access-ckz6z\") pod \"keystone-bootstrap-jvqtp\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.442191 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.503737 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-f6fkg"] Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.506073 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.532385 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-f6fkg"] Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.610822 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.610956 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-config\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.611001 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt4m8\" (UniqueName: \"kubernetes.io/projected/4416fa4b-8d2a-4580-bdf8-332129f418cf-kube-api-access-qt4m8\") pod 
\"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.611064 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.611211 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.611387 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.623859 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eaf31d0-9c9f-437d-86b8-c2372266a25e" path="/var/lib/kubelet/pods/9eaf31d0-9c9f-437d-86b8-c2372266a25e/volumes" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.712901 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" 
Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.713002 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.713064 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-config\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.713090 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt4m8\" (UniqueName: \"kubernetes.io/projected/4416fa4b-8d2a-4580-bdf8-332129f418cf-kube-api-access-qt4m8\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.713117 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.713195 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.714381 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.714381 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-config\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.714635 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.714704 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.715038 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.748346 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt4m8\" (UniqueName: 
\"kubernetes.io/projected/4416fa4b-8d2a-4580-bdf8-332129f418cf-kube-api-access-qt4m8\") pod \"dnsmasq-dns-56df8fb6b7-f6fkg\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg"
Mar 18 15:57:07 crc kubenswrapper[4696]: I0318 15:57:07.830458 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.352471 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.364570 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.364743 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.369149 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.369325 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6rg96"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.374441 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.426692 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.426960 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-logs\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.427313 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.427405 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.427567 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89frh\" (UniqueName: \"kubernetes.io/projected/f0800565-af0f-408d-8da5-81341ad3e2af-kube-api-access-89frh\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.427646 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.427779 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.530597 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.531691 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.531859 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89frh\" (UniqueName: \"kubernetes.io/projected/f0800565-af0f-408d-8da5-81341ad3e2af-kube-api-access-89frh\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.531983 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.532088 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.532313 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.532474 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-logs\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.533476 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-logs\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.533939 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.534419 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.545137 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.546239 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.546916 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.559967 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89frh\" (UniqueName: \"kubernetes.io/projected/f0800565-af0f-408d-8da5-81341ad3e2af-kube-api-access-89frh\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.578831 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") " pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.681905 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.683473 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.686375 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.701737 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.726342 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.749234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hcdw\" (UniqueName: \"kubernetes.io/projected/204d5c0e-9847-4b8f-be9a-9592da9f389c-kube-api-access-4hcdw\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.749338 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.749390 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-logs\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.749415 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.749550 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.749581 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.749643 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.850650 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.851121 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-logs\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.851158 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.851207 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.851232 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.851295 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.851364 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hcdw\" (UniqueName: \"kubernetes.io/projected/204d5c0e-9847-4b8f-be9a-9592da9f389c-kube-api-access-4hcdw\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.852924 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.853060 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.853194 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-logs\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.856114 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.856828 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.872099 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hcdw\" (UniqueName: \"kubernetes.io/projected/204d5c0e-9847-4b8f-be9a-9592da9f389c-kube-api-access-4hcdw\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.884995 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:08 crc kubenswrapper[4696]: I0318 15:57:08.897478 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:09 crc kubenswrapper[4696]: I0318 15:57:09.034574 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:10 crc kubenswrapper[4696]: I0318 15:57:10.629117 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 15:57:10 crc kubenswrapper[4696]: I0318 15:57:10.720685 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 15:57:12 crc kubenswrapper[4696]: I0318 15:57:12.448261 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.398281 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.450359 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c77bcccc7-9tt7j"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.451572 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9dbdd94d9-hg2vz"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.452657 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.548548 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-sb\") pod \"c4151e21-6506-415e-9dbe-3fe4389838b6\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.548676 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-config\") pod \"c4151e21-6506-415e-9dbe-3fe4389838b6\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.548792 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k67ww\" (UniqueName: \"kubernetes.io/projected/c4151e21-6506-415e-9dbe-3fe4389838b6-kube-api-access-k67ww\") pod \"c4151e21-6506-415e-9dbe-3fe4389838b6\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.548840 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-nb\") pod \"c4151e21-6506-415e-9dbe-3fe4389838b6\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.548926 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-dns-svc\") pod \"c4151e21-6506-415e-9dbe-3fe4389838b6\" (UID: \"c4151e21-6506-415e-9dbe-3fe4389838b6\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.562031 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4151e21-6506-415e-9dbe-3fe4389838b6-kube-api-access-k67ww" (OuterVolumeSpecName: "kube-api-access-k67ww") pod "c4151e21-6506-415e-9dbe-3fe4389838b6" (UID: "c4151e21-6506-415e-9dbe-3fe4389838b6"). InnerVolumeSpecName "kube-api-access-k67ww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.608385 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c77bcccc7-9tt7j"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.608421 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c77bcccc7-9tt7j" event={"ID":"6531566d-5085-46df-8412-2285f9f04c19","Type":"ContainerDied","Data":"32c5566ed08c4730c6bf46ce48a38660bc68b1084925e192cb61a20619a4784d"}
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.611230 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9dbdd94d9-hg2vz" event={"ID":"fd1aba9c-5505-4cee-a201-bc88e5c75f92","Type":"ContainerDied","Data":"f71abd300c76fc25d08b6bda1e227e7472ac14d6d6b48c86ce1bbf952ea0a6cb"}
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.611355 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9dbdd94d9-hg2vz"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.615611 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b785567b9-m76ht"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.615645 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b785567b9-m76ht" event={"ID":"80f279e9-8ebe-4ce6-b59f-727f3fd97678","Type":"ContainerDied","Data":"59809f91b1c92d58ced1cee68b8f28e6cc61947bf4fcead29c2dfc99bad0ef9b"}
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.617005 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4151e21-6506-415e-9dbe-3fe4389838b6" (UID: "c4151e21-6506-415e-9dbe-3fe4389838b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.623756 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4151e21-6506-415e-9dbe-3fe4389838b6" (UID: "c4151e21-6506-415e-9dbe-3fe4389838b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.624890 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" event={"ID":"c4151e21-6506-415e-9dbe-3fe4389838b6","Type":"ContainerDied","Data":"c26e8846f0e06cdf3c47a2cbb873eca24e40de0cb6839d925c649c4d83e25535"}
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.624968 4696 scope.go:117] "RemoveContainer" containerID="a8bd79c572b1a15ef5a9fbd43e71eaa9854e895448bcf5c07abc87817cd7fa22"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.625182 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf"
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.637144 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4151e21-6506-415e-9dbe-3fe4389838b6" (UID: "c4151e21-6506-415e-9dbe-3fe4389838b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.644933 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-config" (OuterVolumeSpecName: "config") pod "c4151e21-6506-415e-9dbe-3fe4389838b6" (UID: "c4151e21-6506-415e-9dbe-3fe4389838b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.662863 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbddk\" (UniqueName: \"kubernetes.io/projected/fd1aba9c-5505-4cee-a201-bc88e5c75f92-kube-api-access-dbddk\") pod \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663002 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v4g7\" (UniqueName: \"kubernetes.io/projected/6531566d-5085-46df-8412-2285f9f04c19-kube-api-access-5v4g7\") pod \"6531566d-5085-46df-8412-2285f9f04c19\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663040 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd1aba9c-5505-4cee-a201-bc88e5c75f92-horizon-secret-key\") pod \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663075 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6drkq\" (UniqueName: \"kubernetes.io/projected/80f279e9-8ebe-4ce6-b59f-727f3fd97678-kube-api-access-6drkq\") pod \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663186 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-scripts\") pod \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663221 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-config-data\") pod \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663256 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80f279e9-8ebe-4ce6-b59f-727f3fd97678-horizon-secret-key\") pod \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663358 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6531566d-5085-46df-8412-2285f9f04c19-horizon-secret-key\") pod \"6531566d-5085-46df-8412-2285f9f04c19\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663396 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-config-data\") pod \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663443 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f279e9-8ebe-4ce6-b59f-727f3fd97678-logs\") pod \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663574 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-scripts\") pod \"6531566d-5085-46df-8412-2285f9f04c19\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663625 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6531566d-5085-46df-8412-2285f9f04c19-logs\") pod \"6531566d-5085-46df-8412-2285f9f04c19\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663662 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd1aba9c-5505-4cee-a201-bc88e5c75f92-logs\") pod \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\" (UID: \"fd1aba9c-5505-4cee-a201-bc88e5c75f92\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663737 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-config-data\") pod \"6531566d-5085-46df-8412-2285f9f04c19\" (UID: \"6531566d-5085-46df-8412-2285f9f04c19\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.663774 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-scripts\") pod \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\" (UID: \"80f279e9-8ebe-4ce6-b59f-727f3fd97678\") "
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.664389 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-scripts" (OuterVolumeSpecName: "scripts") pod "fd1aba9c-5505-4cee-a201-bc88e5c75f92" (UID: "fd1aba9c-5505-4cee-a201-bc88e5c75f92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.664653 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f279e9-8ebe-4ce6-b59f-727f3fd97678-logs" (OuterVolumeSpecName: "logs") pod "80f279e9-8ebe-4ce6-b59f-727f3fd97678" (UID: "80f279e9-8ebe-4ce6-b59f-727f3fd97678"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666218 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-config-data" (OuterVolumeSpecName: "config-data") pod "fd1aba9c-5505-4cee-a201-bc88e5c75f92" (UID: "fd1aba9c-5505-4cee-a201-bc88e5c75f92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666788 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666818 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd1aba9c-5505-4cee-a201-bc88e5c75f92-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666833 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k67ww\" (UniqueName: \"kubernetes.io/projected/c4151e21-6506-415e-9dbe-3fe4389838b6-kube-api-access-k67ww\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666849 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f279e9-8ebe-4ce6-b59f-727f3fd97678-logs\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666863 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666877 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666891 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666903 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4151e21-6506-415e-9dbe-3fe4389838b6-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.666846 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-scripts" (OuterVolumeSpecName: "scripts") pod "80f279e9-8ebe-4ce6-b59f-727f3fd97678" (UID: "80f279e9-8ebe-4ce6-b59f-727f3fd97678"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.667265 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1aba9c-5505-4cee-a201-bc88e5c75f92-logs" (OuterVolumeSpecName: "logs") pod "fd1aba9c-5505-4cee-a201-bc88e5c75f92" (UID: "fd1aba9c-5505-4cee-a201-bc88e5c75f92"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.669241 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd1aba9c-5505-4cee-a201-bc88e5c75f92-kube-api-access-dbddk" (OuterVolumeSpecName: "kube-api-access-dbddk") pod "fd1aba9c-5505-4cee-a201-bc88e5c75f92" (UID: "fd1aba9c-5505-4cee-a201-bc88e5c75f92"). InnerVolumeSpecName "kube-api-access-dbddk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.669283 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-scripts" (OuterVolumeSpecName: "scripts") pod "6531566d-5085-46df-8412-2285f9f04c19" (UID: "6531566d-5085-46df-8412-2285f9f04c19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.669898 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-config-data" (OuterVolumeSpecName: "config-data") pod "80f279e9-8ebe-4ce6-b59f-727f3fd97678" (UID: "80f279e9-8ebe-4ce6-b59f-727f3fd97678"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.670398 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6531566d-5085-46df-8412-2285f9f04c19-logs" (OuterVolumeSpecName: "logs") pod "6531566d-5085-46df-8412-2285f9f04c19" (UID: "6531566d-5085-46df-8412-2285f9f04c19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.671108 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-config-data" (OuterVolumeSpecName: "config-data") pod "6531566d-5085-46df-8412-2285f9f04c19" (UID: "6531566d-5085-46df-8412-2285f9f04c19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.671306 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6531566d-5085-46df-8412-2285f9f04c19-kube-api-access-5v4g7" (OuterVolumeSpecName: "kube-api-access-5v4g7") pod "6531566d-5085-46df-8412-2285f9f04c19" (UID: "6531566d-5085-46df-8412-2285f9f04c19"). InnerVolumeSpecName "kube-api-access-5v4g7".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.671844 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f279e9-8ebe-4ce6-b59f-727f3fd97678-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "80f279e9-8ebe-4ce6-b59f-727f3fd97678" (UID: "80f279e9-8ebe-4ce6-b59f-727f3fd97678"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.675300 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f279e9-8ebe-4ce6-b59f-727f3fd97678-kube-api-access-6drkq" (OuterVolumeSpecName: "kube-api-access-6drkq") pod "80f279e9-8ebe-4ce6-b59f-727f3fd97678" (UID: "80f279e9-8ebe-4ce6-b59f-727f3fd97678"). InnerVolumeSpecName "kube-api-access-6drkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.675311 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd1aba9c-5505-4cee-a201-bc88e5c75f92-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fd1aba9c-5505-4cee-a201-bc88e5c75f92" (UID: "fd1aba9c-5505-4cee-a201-bc88e5c75f92"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.675666 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6531566d-5085-46df-8412-2285f9f04c19-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6531566d-5085-46df-8412-2285f9f04c19" (UID: "6531566d-5085-46df-8412-2285f9f04c19"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768266 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768297 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6531566d-5085-46df-8412-2285f9f04c19-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768309 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd1aba9c-5505-4cee-a201-bc88e5c75f92-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768319 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6531566d-5085-46df-8412-2285f9f04c19-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768331 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768341 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbddk\" (UniqueName: \"kubernetes.io/projected/fd1aba9c-5505-4cee-a201-bc88e5c75f92-kube-api-access-dbddk\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768353 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6drkq\" (UniqueName: \"kubernetes.io/projected/80f279e9-8ebe-4ce6-b59f-727f3fd97678-kube-api-access-6drkq\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768362 4696 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-5v4g7\" (UniqueName: \"kubernetes.io/projected/6531566d-5085-46df-8412-2285f9f04c19-kube-api-access-5v4g7\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768373 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd1aba9c-5505-4cee-a201-bc88e5c75f92-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768383 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80f279e9-8ebe-4ce6-b59f-727f3fd97678-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768391 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6531566d-5085-46df-8412-2285f9f04c19-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.768399 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80f279e9-8ebe-4ce6-b59f-727f3fd97678-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:16 crc kubenswrapper[4696]: I0318 15:57:16.837277 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-696476876d-4rxz2"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.001844 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9dbdd94d9-hg2vz"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.012603 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9dbdd94d9-hg2vz"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.041333 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c77bcccc7-9tt7j"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.050747 4696 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/horizon-6c77bcccc7-9tt7j"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.068416 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5wrdf"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.083237 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5wrdf"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.097594 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b785567b9-m76ht"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.101201 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b785567b9-m76ht"] Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.448718 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-5wrdf" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout" Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.618151 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6531566d-5085-46df-8412-2285f9f04c19" path="/var/lib/kubelet/pods/6531566d-5085-46df-8412-2285f9f04c19/volumes" Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.619137 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f279e9-8ebe-4ce6-b59f-727f3fd97678" path="/var/lib/kubelet/pods/80f279e9-8ebe-4ce6-b59f-727f3fd97678/volumes" Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.619691 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" path="/var/lib/kubelet/pods/c4151e21-6506-415e-9dbe-3fe4389838b6/volumes" Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.620909 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd1aba9c-5505-4cee-a201-bc88e5c75f92" 
path="/var/lib/kubelet/pods/fd1aba9c-5505-4cee-a201-bc88e5c75f92/volumes" Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.639015 4696 generic.go:334] "Generic (PLEG): container finished" podID="0e07ce44-d586-4f16-952b-5e1eb3b0cfa6" containerID="fc978c83701e43ec969e0cf94165ae2aacd2fcac4bf6931bdf030d1808b2cd67" exitCode=0 Mar 18 15:57:17 crc kubenswrapper[4696]: I0318 15:57:17.639065 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h94vl" event={"ID":"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6","Type":"ContainerDied","Data":"fc978c83701e43ec969e0cf94165ae2aacd2fcac4bf6931bdf030d1808b2cd67"} Mar 18 15:57:17 crc kubenswrapper[4696]: E0318 15:57:17.821613 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 15:57:17 crc kubenswrapper[4696]: E0318 15:57:17.821911 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5kwjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-v2tzn_openstack(4a76866d-35bf-4dee-8fc4-a5c018e9edce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:57:17 crc kubenswrapper[4696]: E0318 15:57:17.823341 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-v2tzn" podUID="4a76866d-35bf-4dee-8fc4-a5c018e9edce" Mar 18 15:57:17 crc kubenswrapper[4696]: W0318 15:57:17.867880 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72021b21_00cf_4c33_be2d_b24f20dc0f9f.slice/crio-d4ca6da5c146bca7f3b27142eac4ced4392eec6952db9bb6b4a57aac46b40ec9 WatchSource:0}: Error finding container d4ca6da5c146bca7f3b27142eac4ced4392eec6952db9bb6b4a57aac46b40ec9: Status 404 returned error can't find the container with id d4ca6da5c146bca7f3b27142eac4ced4392eec6952db9bb6b4a57aac46b40ec9 Mar 18 15:57:18 crc kubenswrapper[4696]: I0318 15:57:18.218147 4696 scope.go:117] "RemoveContainer" containerID="c1d07a961f3cc3785a93f226c44aa10c3e1288c80e6465e883ffdef7abb82568" Mar 18 15:57:18 crc kubenswrapper[4696]: I0318 15:57:18.707442 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-696476876d-4rxz2" event={"ID":"72021b21-00cf-4c33-be2d-b24f20dc0f9f","Type":"ContainerStarted","Data":"d4ca6da5c146bca7f3b27142eac4ced4392eec6952db9bb6b4a57aac46b40ec9"} Mar 18 15:57:18 crc kubenswrapper[4696]: W0318 15:57:18.711496 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabd090d6_037c_4cc7_907a_43293ce636ff.slice/crio-81e341330257c811eb43bb6411207c00c9872e17ae6a49a4e95c7843a3b02418 WatchSource:0}: Error finding container 81e341330257c811eb43bb6411207c00c9872e17ae6a49a4e95c7843a3b02418: Status 404 returned error can't find the container with id 81e341330257c811eb43bb6411207c00c9872e17ae6a49a4e95c7843a3b02418 Mar 18 15:57:18 crc kubenswrapper[4696]: E0318 15:57:18.717708 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-v2tzn" podUID="4a76866d-35bf-4dee-8fc4-a5c018e9edce" Mar 18 15:57:18 crc kubenswrapper[4696]: I0318 15:57:18.724023 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59764c649b-dxxpb"] Mar 18 15:57:18 crc kubenswrapper[4696]: I0318 15:57:18.870418 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jvqtp"] Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.072129 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-f6fkg"] Mar 18 15:57:19 crc kubenswrapper[4696]: W0318 15:57:19.152169 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4416fa4b_8d2a_4580_bdf8_332129f418cf.slice/crio-5fb1bd455a41056291ea6e4774770b23d0c85f3a5388b0cf88dc824d57ab99ed WatchSource:0}: Error finding container 5fb1bd455a41056291ea6e4774770b23d0c85f3a5388b0cf88dc824d57ab99ed: Status 404 returned error can't find the container with id 5fb1bd455a41056291ea6e4774770b23d0c85f3a5388b0cf88dc824d57ab99ed Mar 18 15:57:19 crc kubenswrapper[4696]: W0318 15:57:19.162868 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204d5c0e_9847_4b8f_be9a_9592da9f389c.slice/crio-3aaa43f2ea0614bb5824e095078d9fd6d396d62ea4c18af3c3ffa56aa70910c8 WatchSource:0}: Error finding container 3aaa43f2ea0614bb5824e095078d9fd6d396d62ea4c18af3c3ffa56aa70910c8: Status 404 returned error can't find the container with id 3aaa43f2ea0614bb5824e095078d9fd6d396d62ea4c18af3c3ffa56aa70910c8 Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.174895 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.236958 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h94vl" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.242309 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.432718 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-combined-ca-bundle\") pod \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.433288 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-config\") pod \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.433368 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdlmq\" (UniqueName: \"kubernetes.io/projected/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-kube-api-access-pdlmq\") pod \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\" (UID: \"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6\") " Mar 18 
15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.442445 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-kube-api-access-pdlmq" (OuterVolumeSpecName: "kube-api-access-pdlmq") pod "0e07ce44-d586-4f16-952b-5e1eb3b0cfa6" (UID: "0e07ce44-d586-4f16-952b-5e1eb3b0cfa6"). InnerVolumeSpecName "kube-api-access-pdlmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.473173 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-config" (OuterVolumeSpecName: "config") pod "0e07ce44-d586-4f16-952b-5e1eb3b0cfa6" (UID: "0e07ce44-d586-4f16-952b-5e1eb3b0cfa6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.477114 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e07ce44-d586-4f16-952b-5e1eb3b0cfa6" (UID: "0e07ce44-d586-4f16-952b-5e1eb3b0cfa6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.535692 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.535745 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.535755 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdlmq\" (UniqueName: \"kubernetes.io/projected/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6-kube-api-access-pdlmq\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.750276 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fxg9t" event={"ID":"527c444b-3209-4c1e-addb-ed9404ab8efd","Type":"ContainerStarted","Data":"b33330338b64f9ea3a53789b59a27ef3b0fc2d23861c97d78add74615715ec20"} Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.794573 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zfkzp" event={"ID":"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617","Type":"ContainerStarted","Data":"a2126385fe875ffff376a0d9a679939b4e646644b2caacd676f6cbcd88de07f6"} Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.795599 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fxg9t" podStartSLOduration=3.351866475 podStartE2EDuration="32.79557449s" podCreationTimestamp="2026-03-18 15:56:47 +0000 UTC" firstStartedPulling="2026-03-18 15:56:49.724636379 +0000 UTC m=+1252.730810585" lastFinishedPulling="2026-03-18 15:57:19.168344394 +0000 UTC m=+1282.174518600" observedRunningTime="2026-03-18 15:57:19.776168164 +0000 UTC 
m=+1282.782342370" watchObservedRunningTime="2026-03-18 15:57:19.79557449 +0000 UTC m=+1282.801748696" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.808802 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" event={"ID":"4416fa4b-8d2a-4580-bdf8-332129f418cf","Type":"ContainerStarted","Data":"bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0"} Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.808871 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" event={"ID":"4416fa4b-8d2a-4580-bdf8-332129f418cf","Type":"ContainerStarted","Data":"5fb1bd455a41056291ea6e4774770b23d0c85f3a5388b0cf88dc824d57ab99ed"} Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.844111 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zfkzp" podStartSLOduration=2.627203947 podStartE2EDuration="32.844075606s" podCreationTimestamp="2026-03-18 15:56:47 +0000 UTC" firstStartedPulling="2026-03-18 15:56:49.009255423 +0000 UTC m=+1252.015429629" lastFinishedPulling="2026-03-18 15:57:19.226127082 +0000 UTC m=+1282.232301288" observedRunningTime="2026-03-18 15:57:19.823443309 +0000 UTC m=+1282.829617515" watchObservedRunningTime="2026-03-18 15:57:19.844075606 +0000 UTC m=+1282.850249822" Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.888370 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59764c649b-dxxpb" event={"ID":"abd090d6-037c-4cc7-907a-43293ce636ff","Type":"ContainerStarted","Data":"f872f7a16cb16a56607600c69b9558be12b521b6a49f10ce98c54c53f408a5f3"} Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.891377 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59764c649b-dxxpb" event={"ID":"abd090d6-037c-4cc7-907a-43293ce636ff","Type":"ContainerStarted","Data":"87e03f1072162046cc134d28d92b5153b10795f5c74003465bb2ca3f35d0c12a"} Mar 18 15:57:19 crc 
kubenswrapper[4696]: I0318 15:57:19.891411 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59764c649b-dxxpb" event={"ID":"abd090d6-037c-4cc7-907a-43293ce636ff","Type":"ContainerStarted","Data":"81e341330257c811eb43bb6411207c00c9872e17ae6a49a4e95c7843a3b02418"} Mar 18 15:57:19 crc kubenswrapper[4696]: I0318 15:57:19.964340 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-f6fkg"] Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.989495 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e295c1a-9787-42a3-ac9c-5252bda652b5","Type":"ContainerStarted","Data":"7a723587643d21a0b67f5b2a09f847f0f2f128465d455ccdbee4d6d19cdcaea8"} Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.991298 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-nl7sv"] Mar 18 15:57:20 crc kubenswrapper[4696]: E0318 15:57:19.992354 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e07ce44-d586-4f16-952b-5e1eb3b0cfa6" containerName="neutron-db-sync" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.992391 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e07ce44-d586-4f16-952b-5e1eb3b0cfa6" containerName="neutron-db-sync" Mar 18 15:57:20 crc kubenswrapper[4696]: E0318 15:57:19.992419 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="dnsmasq-dns" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.992425 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="dnsmasq-dns" Mar 18 15:57:20 crc kubenswrapper[4696]: E0318 15:57:19.992462 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="init" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.992469 4696 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="init" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.992735 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e07ce44-d586-4f16-952b-5e1eb3b0cfa6" containerName="neutron-db-sync" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.992773 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4151e21-6506-415e-9dbe-3fe4389838b6" containerName="dnsmasq-dns" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.993010 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-59764c649b-dxxpb" podStartSLOduration=23.992983117 podStartE2EDuration="23.992983117s" podCreationTimestamp="2026-03-18 15:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:19.91929718 +0000 UTC m=+1282.925471406" watchObservedRunningTime="2026-03-18 15:57:19.992983117 +0000 UTC m=+1282.999157323" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:19.993828 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.000642 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204d5c0e-9847-4b8f-be9a-9592da9f389c","Type":"ContainerStarted","Data":"3aaa43f2ea0614bb5824e095078d9fd6d396d62ea4c18af3c3ffa56aa70910c8"} Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.036628 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-nl7sv"] Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.051377 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-696476876d-4rxz2" event={"ID":"72021b21-00cf-4c33-be2d-b24f20dc0f9f","Type":"ContainerStarted","Data":"aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420"} Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.051453 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-696476876d-4rxz2" event={"ID":"72021b21-00cf-4c33-be2d-b24f20dc0f9f","Type":"ContainerStarted","Data":"edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d"} Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.065009 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0800565-af0f-408d-8da5-81341ad3e2af","Type":"ContainerStarted","Data":"46d5927555911b09233f49dc08a1d6f8316a31b5e0a5b454188cf5e273f82ac0"} Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.078433 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jvqtp" event={"ID":"319d79c7-8160-4b57-9b13-3797a015cbdf","Type":"ContainerStarted","Data":"c1de6795c4da3f0414c9c4ed7fa8707ce5a74ed14795eb5ac72927b8f0cea52d"} Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.078496 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jvqtp" 
event={"ID":"319d79c7-8160-4b57-9b13-3797a015cbdf","Type":"ContainerStarted","Data":"2b54a12aa02c0ea6a6db3be501f14c79386878f9c9ddd217ba7085069c2cc1a3"} Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.088842 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6fd69d9b-g5mkf"] Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.090811 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.093424 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.098057 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h94vl" event={"ID":"0e07ce44-d586-4f16-952b-5e1eb3b0cfa6","Type":"ContainerDied","Data":"ec0d5bc4f1f681a5ec5198a49bb43a83d91aef0b58c7533ca216c1003ad21e2d"} Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.098099 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec0d5bc4f1f681a5ec5198a49bb43a83d91aef0b58c7533ca216c1003ad21e2d" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.098163 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-h94vl" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.140629 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fd69d9b-g5mkf"] Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.220346 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-696476876d-4rxz2" podStartSLOduration=23.610526733 podStartE2EDuration="24.220314223s" podCreationTimestamp="2026-03-18 15:56:56 +0000 UTC" firstStartedPulling="2026-03-18 15:57:17.873329405 +0000 UTC m=+1280.879503651" lastFinishedPulling="2026-03-18 15:57:18.483116935 +0000 UTC m=+1281.489291141" observedRunningTime="2026-03-18 15:57:20.088072309 +0000 UTC m=+1283.094246525" watchObservedRunningTime="2026-03-18 15:57:20.220314223 +0000 UTC m=+1283.226488429" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.247353 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-config\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.252031 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.252289 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-svc\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " 
pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.252321 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqlr\" (UniqueName: \"kubernetes.io/projected/8531ccd7-8b34-4e09-8b34-1fcf88234f42-kube-api-access-4cqlr\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.252616 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.252668 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.283926 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jvqtp" podStartSLOduration=13.283901736 podStartE2EDuration="13.283901736s" podCreationTimestamp="2026-03-18 15:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:20.121123558 +0000 UTC m=+1283.127297784" watchObservedRunningTime="2026-03-18 15:57:20.283901736 +0000 UTC m=+1283.290075942" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.356862 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-config\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.356959 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.357003 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-ovndb-tls-certs\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.357098 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-svc\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.357131 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqlr\" (UniqueName: \"kubernetes.io/projected/8531ccd7-8b34-4e09-8b34-1fcf88234f42-kube-api-access-4cqlr\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.357175 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-httpd-config\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.357200 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-combined-ca-bundle\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.357272 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.357296 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.357370 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kklfm\" (UniqueName: \"kubernetes.io/projected/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-kube-api-access-kklfm\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.358480 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-config\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.363461 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-svc\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.363622 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-config\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.363884 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.365847 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.367162 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: 
\"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.385068 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqlr\" (UniqueName: \"kubernetes.io/projected/8531ccd7-8b34-4e09-8b34-1fcf88234f42-kube-api-access-4cqlr\") pod \"dnsmasq-dns-6b7b667979-nl7sv\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.461273 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kklfm\" (UniqueName: \"kubernetes.io/projected/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-kube-api-access-kklfm\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.461365 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-config\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.461433 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-ovndb-tls-certs\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.461486 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-httpd-config\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc 
kubenswrapper[4696]: I0318 15:57:20.461528 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-combined-ca-bundle\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.465807 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.468207 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-combined-ca-bundle\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.469450 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-config\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.471498 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-ovndb-tls-certs\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.479841 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-httpd-config\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 
crc kubenswrapper[4696]: I0318 15:57:20.486170 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kklfm\" (UniqueName: \"kubernetes.io/projected/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-kube-api-access-kklfm\") pod \"neutron-6fd69d9b-g5mkf\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.678716 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:20 crc kubenswrapper[4696]: I0318 15:57:20.729863 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:21 crc kubenswrapper[4696]: I0318 15:57:21.125590 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0800565-af0f-408d-8da5-81341ad3e2af","Type":"ContainerStarted","Data":"bd9db8d474d1dcc68d9bd773bc78267a4e93cb536ef69020e1530fce33747aa7"} Mar 18 15:57:21 crc kubenswrapper[4696]: I0318 15:57:21.150890 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204d5c0e-9847-4b8f-be9a-9592da9f389c","Type":"ContainerStarted","Data":"837c76e7842bef25a0f34d98b9451254faeeab02a57c534dee64a0ecfa3cbfde"} Mar 18 15:57:21 crc kubenswrapper[4696]: I0318 15:57:21.170307 4696 generic.go:334] "Generic (PLEG): container finished" podID="4416fa4b-8d2a-4580-bdf8-332129f418cf" containerID="bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0" exitCode=0 Mar 18 15:57:21 crc kubenswrapper[4696]: I0318 15:57:21.171809 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" event={"ID":"4416fa4b-8d2a-4580-bdf8-332129f418cf","Type":"ContainerDied","Data":"bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0"} Mar 18 15:57:21 crc kubenswrapper[4696]: I0318 15:57:21.366690 4696 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-nl7sv"] Mar 18 15:57:21 crc kubenswrapper[4696]: I0318 15:57:21.674261 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fd69d9b-g5mkf"] Mar 18 15:57:22 crc kubenswrapper[4696]: I0318 15:57:22.187276 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204d5c0e-9847-4b8f-be9a-9592da9f389c","Type":"ContainerStarted","Data":"a5dad8c8431f0b556e68e1b923a5c3aca92c3b3661e736a20042a4a880e6b2c2"} Mar 18 15:57:22 crc kubenswrapper[4696]: I0318 15:57:22.187827 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerName="glance-log" containerID="cri-o://837c76e7842bef25a0f34d98b9451254faeeab02a57c534dee64a0ecfa3cbfde" gracePeriod=30 Mar 18 15:57:22 crc kubenswrapper[4696]: I0318 15:57:22.188327 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerName="glance-httpd" containerID="cri-o://a5dad8c8431f0b556e68e1b923a5c3aca92c3b3661e736a20042a4a880e6b2c2" gracePeriod=30 Mar 18 15:57:22 crc kubenswrapper[4696]: I0318 15:57:22.198047 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd69d9b-g5mkf" event={"ID":"8a89cfa6-c720-40a1-aaa2-cfe62c153c14","Type":"ContainerStarted","Data":"86ba95dc63897e0d296c5f46330ce41ccaf6f201a25be51b0c5b50d5df98c356"} Mar 18 15:57:22 crc kubenswrapper[4696]: I0318 15:57:22.199475 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" event={"ID":"8531ccd7-8b34-4e09-8b34-1fcf88234f42","Type":"ContainerStarted","Data":"7e337a4206dc70b633f2b9537d4104b7b2860411e89dc5609cbf2b721c487bac"} Mar 18 15:57:22 crc kubenswrapper[4696]: I0318 15:57:22.217167 4696 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.217135317 podStartE2EDuration="15.217135317s" podCreationTimestamp="2026-03-18 15:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:22.214931392 +0000 UTC m=+1285.221105598" watchObservedRunningTime="2026-03-18 15:57:22.217135317 +0000 UTC m=+1285.223309513" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.155732 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c74b66c77-jb2xr"] Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.158381 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.163044 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.163363 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.179219 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c74b66c77-jb2xr"] Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.219193 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" event={"ID":"4416fa4b-8d2a-4580-bdf8-332129f418cf","Type":"ContainerStarted","Data":"9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960"} Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.219346 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.219377 4696 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" podUID="4416fa4b-8d2a-4580-bdf8-332129f418cf" containerName="dnsmasq-dns" containerID="cri-o://9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960" gracePeriod=10 Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.238116 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0800565-af0f-408d-8da5-81341ad3e2af","Type":"ContainerStarted","Data":"bb4205e4a94db855d71bbec5b97e1b9fcc69fb30e1481ccf5ade2b9d0ea59c60"} Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.238410 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" containerName="glance-log" containerID="cri-o://bd9db8d474d1dcc68d9bd773bc78267a4e93cb536ef69020e1530fce33747aa7" gracePeriod=30 Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.240764 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" containerName="glance-httpd" containerID="cri-o://bb4205e4a94db855d71bbec5b97e1b9fcc69fb30e1481ccf5ade2b9d0ea59c60" gracePeriod=30 Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.253490 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" podStartSLOduration=16.253462824 podStartE2EDuration="16.253462824s" podCreationTimestamp="2026-03-18 15:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:23.245067893 +0000 UTC m=+1286.251242099" watchObservedRunningTime="2026-03-18 15:57:23.253462824 +0000 UTC m=+1286.259637030" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.258872 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd69d9b-g5mkf" 
event={"ID":"8a89cfa6-c720-40a1-aaa2-cfe62c153c14","Type":"ContainerStarted","Data":"3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6"} Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.258937 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd69d9b-g5mkf" event={"ID":"8a89cfa6-c720-40a1-aaa2-cfe62c153c14","Type":"ContainerStarted","Data":"1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67"} Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.260136 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.281755 4696 generic.go:334] "Generic (PLEG): container finished" podID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" containerID="33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42" exitCode=0 Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.281909 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" event={"ID":"8531ccd7-8b34-4e09-8b34-1fcf88234f42","Type":"ContainerDied","Data":"33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42"} Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.302101 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.301901808 podStartE2EDuration="16.301901808s" podCreationTimestamp="2026-03-18 15:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:23.264182642 +0000 UTC m=+1286.270356848" watchObservedRunningTime="2026-03-18 15:57:23.301901808 +0000 UTC m=+1286.308076014" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.328456 4696 generic.go:334] "Generic (PLEG): container finished" podID="204d5c0e-9847-4b8f-be9a-9592da9f389c" 
containerID="a5dad8c8431f0b556e68e1b923a5c3aca92c3b3661e736a20042a4a880e6b2c2" exitCode=0 Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.328499 4696 generic.go:334] "Generic (PLEG): container finished" podID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerID="837c76e7842bef25a0f34d98b9451254faeeab02a57c534dee64a0ecfa3cbfde" exitCode=143 Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.328541 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204d5c0e-9847-4b8f-be9a-9592da9f389c","Type":"ContainerDied","Data":"a5dad8c8431f0b556e68e1b923a5c3aca92c3b3661e736a20042a4a880e6b2c2"} Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.328575 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204d5c0e-9847-4b8f-be9a-9592da9f389c","Type":"ContainerDied","Data":"837c76e7842bef25a0f34d98b9451254faeeab02a57c534dee64a0ecfa3cbfde"} Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.349783 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-public-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.350870 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-config\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.359040 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98rrm\" (UniqueName: 
\"kubernetes.io/projected/2ca71182-1195-450d-a81b-b6db4dff526e-kube-api-access-98rrm\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.360173 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-combined-ca-bundle\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.360368 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-internal-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.360717 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-httpd-config\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.360845 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-ovndb-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.362828 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6fd69d9b-g5mkf" 
podStartSLOduration=4.362814774 podStartE2EDuration="4.362814774s" podCreationTimestamp="2026-03-18 15:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:23.310082593 +0000 UTC m=+1286.316256809" watchObservedRunningTime="2026-03-18 15:57:23.362814774 +0000 UTC m=+1286.368988980" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.466092 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98rrm\" (UniqueName: \"kubernetes.io/projected/2ca71182-1195-450d-a81b-b6db4dff526e-kube-api-access-98rrm\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.466170 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-combined-ca-bundle\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.466197 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-internal-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.466266 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-httpd-config\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.466301 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-ovndb-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.466368 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-config\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.466387 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-public-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.478450 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-ovndb-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.478704 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-public-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.478915 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-config\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.483167 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-httpd-config\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.483751 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-internal-tls-certs\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.491152 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-combined-ca-bundle\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.513056 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98rrm\" (UniqueName: \"kubernetes.io/projected/2ca71182-1195-450d-a81b-b6db4dff526e-kube-api-access-98rrm\") pod \"neutron-5c74b66c77-jb2xr\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.513823 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c74b66c77-jb2xr"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.736696 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.797239 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-httpd-run\") pod \"204d5c0e-9847-4b8f-be9a-9592da9f389c\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") "
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.797299 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"204d5c0e-9847-4b8f-be9a-9592da9f389c\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") "
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.797374 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-combined-ca-bundle\") pod \"204d5c0e-9847-4b8f-be9a-9592da9f389c\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") "
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.797402 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-config-data\") pod \"204d5c0e-9847-4b8f-be9a-9592da9f389c\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") "
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.797443 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-scripts\") pod \"204d5c0e-9847-4b8f-be9a-9592da9f389c\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") "
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.797559 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hcdw\" (UniqueName: \"kubernetes.io/projected/204d5c0e-9847-4b8f-be9a-9592da9f389c-kube-api-access-4hcdw\") pod \"204d5c0e-9847-4b8f-be9a-9592da9f389c\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") "
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.797595 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-logs\") pod \"204d5c0e-9847-4b8f-be9a-9592da9f389c\" (UID: \"204d5c0e-9847-4b8f-be9a-9592da9f389c\") "
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.803223 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-logs" (OuterVolumeSpecName: "logs") pod "204d5c0e-9847-4b8f-be9a-9592da9f389c" (UID: "204d5c0e-9847-4b8f-be9a-9592da9f389c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.808894 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "204d5c0e-9847-4b8f-be9a-9592da9f389c" (UID: "204d5c0e-9847-4b8f-be9a-9592da9f389c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.818786 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-scripts" (OuterVolumeSpecName: "scripts") pod "204d5c0e-9847-4b8f-be9a-9592da9f389c" (UID: "204d5c0e-9847-4b8f-be9a-9592da9f389c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.818822 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "204d5c0e-9847-4b8f-be9a-9592da9f389c" (UID: "204d5c0e-9847-4b8f-be9a-9592da9f389c"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.821608 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204d5c0e-9847-4b8f-be9a-9592da9f389c-kube-api-access-4hcdw" (OuterVolumeSpecName: "kube-api-access-4hcdw") pod "204d5c0e-9847-4b8f-be9a-9592da9f389c" (UID: "204d5c0e-9847-4b8f-be9a-9592da9f389c"). InnerVolumeSpecName "kube-api-access-4hcdw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.865689 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.901267 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hcdw\" (UniqueName: \"kubernetes.io/projected/204d5c0e-9847-4b8f-be9a-9592da9f389c-kube-api-access-4hcdw\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.901317 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-logs\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.901333 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204d5c0e-9847-4b8f-be9a-9592da9f389c-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.901372 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.901387 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.947278 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "204d5c0e-9847-4b8f-be9a-9592da9f389c" (UID: "204d5c0e-9847-4b8f-be9a-9592da9f389c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.950407 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 18 15:57:23 crc kubenswrapper[4696]: I0318 15:57:23.995712 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-config-data" (OuterVolumeSpecName: "config-data") pod "204d5c0e-9847-4b8f-be9a-9592da9f389c" (UID: "204d5c0e-9847-4b8f-be9a-9592da9f389c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.009507 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt4m8\" (UniqueName: \"kubernetes.io/projected/4416fa4b-8d2a-4580-bdf8-332129f418cf-kube-api-access-qt4m8\") pod \"4416fa4b-8d2a-4580-bdf8-332129f418cf\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.009787 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-swift-storage-0\") pod \"4416fa4b-8d2a-4580-bdf8-332129f418cf\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.009931 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-config\") pod \"4416fa4b-8d2a-4580-bdf8-332129f418cf\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.010088 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-sb\") pod \"4416fa4b-8d2a-4580-bdf8-332129f418cf\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.010242 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-svc\") pod \"4416fa4b-8d2a-4580-bdf8-332129f418cf\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.010271 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-nb\") pod \"4416fa4b-8d2a-4580-bdf8-332129f418cf\" (UID: \"4416fa4b-8d2a-4580-bdf8-332129f418cf\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.010857 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.010876 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.010890 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204d5c0e-9847-4b8f-be9a-9592da9f389c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.027970 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4416fa4b-8d2a-4580-bdf8-332129f418cf-kube-api-access-qt4m8" (OuterVolumeSpecName: "kube-api-access-qt4m8") pod "4416fa4b-8d2a-4580-bdf8-332129f418cf" (UID: "4416fa4b-8d2a-4580-bdf8-332129f418cf"). InnerVolumeSpecName "kube-api-access-qt4m8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.089880 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-config" (OuterVolumeSpecName: "config") pod "4416fa4b-8d2a-4580-bdf8-332129f418cf" (UID: "4416fa4b-8d2a-4580-bdf8-332129f418cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.108457 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4416fa4b-8d2a-4580-bdf8-332129f418cf" (UID: "4416fa4b-8d2a-4580-bdf8-332129f418cf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.113284 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.113322 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.113341 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt4m8\" (UniqueName: \"kubernetes.io/projected/4416fa4b-8d2a-4580-bdf8-332129f418cf-kube-api-access-qt4m8\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.118080 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4416fa4b-8d2a-4580-bdf8-332129f418cf" (UID: "4416fa4b-8d2a-4580-bdf8-332129f418cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.149210 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4416fa4b-8d2a-4580-bdf8-332129f418cf" (UID: "4416fa4b-8d2a-4580-bdf8-332129f418cf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.214916 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.214959 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.220774 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4416fa4b-8d2a-4580-bdf8-332129f418cf" (UID: "4416fa4b-8d2a-4580-bdf8-332129f418cf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.318874 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4416fa4b-8d2a-4580-bdf8-332129f418cf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.369746 4696 generic.go:334] "Generic (PLEG): container finished" podID="f0800565-af0f-408d-8da5-81341ad3e2af" containerID="bb4205e4a94db855d71bbec5b97e1b9fcc69fb30e1481ccf5ade2b9d0ea59c60" exitCode=0
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.369784 4696 generic.go:334] "Generic (PLEG): container finished" podID="f0800565-af0f-408d-8da5-81341ad3e2af" containerID="bd9db8d474d1dcc68d9bd773bc78267a4e93cb536ef69020e1530fce33747aa7" exitCode=143
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.369840 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0800565-af0f-408d-8da5-81341ad3e2af","Type":"ContainerDied","Data":"bb4205e4a94db855d71bbec5b97e1b9fcc69fb30e1481ccf5ade2b9d0ea59c60"}
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.369869 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0800565-af0f-408d-8da5-81341ad3e2af","Type":"ContainerDied","Data":"bd9db8d474d1dcc68d9bd773bc78267a4e93cb536ef69020e1530fce33747aa7"}
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.392649 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" event={"ID":"8531ccd7-8b34-4e09-8b34-1fcf88234f42","Type":"ContainerStarted","Data":"4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f"}
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.392821 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.402657 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204d5c0e-9847-4b8f-be9a-9592da9f389c","Type":"ContainerDied","Data":"3aaa43f2ea0614bb5824e095078d9fd6d396d62ea4c18af3c3ffa56aa70910c8"}
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.402720 4696 scope.go:117] "RemoveContainer" containerID="a5dad8c8431f0b556e68e1b923a5c3aca92c3b3661e736a20042a4a880e6b2c2"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.402880 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.424349 4696 generic.go:334] "Generic (PLEG): container finished" podID="4416fa4b-8d2a-4580-bdf8-332129f418cf" containerID="9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960" exitCode=0
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.424423 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" event={"ID":"4416fa4b-8d2a-4580-bdf8-332129f418cf","Type":"ContainerDied","Data":"9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960"}
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.426814 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg" event={"ID":"4416fa4b-8d2a-4580-bdf8-332129f418cf","Type":"ContainerDied","Data":"5fb1bd455a41056291ea6e4774770b23d0c85f3a5388b0cf88dc824d57ab99ed"}
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.424497 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-f6fkg"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.429840 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" podStartSLOduration=5.42982084 podStartE2EDuration="5.42982084s" podCreationTimestamp="2026-03-18 15:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:24.424723992 +0000 UTC m=+1287.430898198" watchObservedRunningTime="2026-03-18 15:57:24.42982084 +0000 UTC m=+1287.435995046"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.471130 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c74b66c77-jb2xr"]
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.622537 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.627464 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.650267 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.654932 4696 scope.go:117] "RemoveContainer" containerID="837c76e7842bef25a0f34d98b9451254faeeab02a57c534dee64a0ecfa3cbfde"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.659666 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-f6fkg"]
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.688565 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-config-data\") pod \"f0800565-af0f-408d-8da5-81341ad3e2af\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.688684 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-logs\") pod \"f0800565-af0f-408d-8da5-81341ad3e2af\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.688779 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-combined-ca-bundle\") pod \"f0800565-af0f-408d-8da5-81341ad3e2af\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.688867 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89frh\" (UniqueName: \"kubernetes.io/projected/f0800565-af0f-408d-8da5-81341ad3e2af-kube-api-access-89frh\") pod \"f0800565-af0f-408d-8da5-81341ad3e2af\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.688897 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f0800565-af0f-408d-8da5-81341ad3e2af\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.688951 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-scripts\") pod \"f0800565-af0f-408d-8da5-81341ad3e2af\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.689020 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-httpd-run\") pod \"f0800565-af0f-408d-8da5-81341ad3e2af\" (UID: \"f0800565-af0f-408d-8da5-81341ad3e2af\") "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.692079 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0800565-af0f-408d-8da5-81341ad3e2af" (UID: "f0800565-af0f-408d-8da5-81341ad3e2af"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.695485 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-f6fkg"]
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.700570 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-logs" (OuterVolumeSpecName: "logs") pod "f0800565-af0f-408d-8da5-81341ad3e2af" (UID: "f0800565-af0f-408d-8da5-81341ad3e2af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.704333 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.704793 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4416fa4b-8d2a-4580-bdf8-332129f418cf" containerName="init"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.704812 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4416fa4b-8d2a-4580-bdf8-332129f418cf" containerName="init"
Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.704831 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerName="glance-log"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.704839 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerName="glance-log"
Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.704857 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4416fa4b-8d2a-4580-bdf8-332129f418cf" containerName="dnsmasq-dns"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.704863 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4416fa4b-8d2a-4580-bdf8-332129f418cf" containerName="dnsmasq-dns"
Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.704880 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" containerName="glance-log"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.704886 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" containerName="glance-log"
Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.704903 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerName="glance-httpd"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.704910 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerName="glance-httpd"
Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.704933 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" containerName="glance-httpd"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.704939 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" containerName="glance-httpd"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.705097 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerName="glance-log"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.705126 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" containerName="glance-httpd"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.705148 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" containerName="glance-log"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.705164 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" containerName="glance-httpd"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.705185 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4416fa4b-8d2a-4580-bdf8-332129f418cf" containerName="dnsmasq-dns"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.710630 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.721053 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-scripts" (OuterVolumeSpecName: "scripts") pod "f0800565-af0f-408d-8da5-81341ad3e2af" (UID: "f0800565-af0f-408d-8da5-81341ad3e2af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.721182 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.737224 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0800565-af0f-408d-8da5-81341ad3e2af-kube-api-access-89frh" (OuterVolumeSpecName: "kube-api-access-89frh") pod "f0800565-af0f-408d-8da5-81341ad3e2af" (UID: "f0800565-af0f-408d-8da5-81341ad3e2af"). InnerVolumeSpecName "kube-api-access-89frh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.737451 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f0800565-af0f-408d-8da5-81341ad3e2af" (UID: "f0800565-af0f-408d-8da5-81341ad3e2af"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.737864 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.745083 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.766951 4696 scope.go:117] "RemoveContainer" containerID="9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.786679 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0800565-af0f-408d-8da5-81341ad3e2af" (UID: "f0800565-af0f-408d-8da5-81341ad3e2af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.792935 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.792989 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793014 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793035 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793061 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793094 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdntd\" (UniqueName: \"kubernetes.io/projected/190416e8-e777-4ec5-b017-8b28d749252e-kube-api-access-hdntd\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793115 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-logs\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793144 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793219 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-logs\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793231 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793243 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89frh\" (UniqueName: \"kubernetes.io/projected/f0800565-af0f-408d-8da5-81341ad3e2af-kube-api-access-89frh\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793263 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793272 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.793285 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0800565-af0f-408d-8da5-81341ad3e2af-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.813320 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-config-data" (OuterVolumeSpecName: "config-data") pod "f0800565-af0f-408d-8da5-81341ad3e2af" (UID: "f0800565-af0f-408d-8da5-81341ad3e2af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.827232 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.848719 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204d5c0e_9847_4b8f_be9a_9592da9f389c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4416fa4b_8d2a_4580_bdf8_332129f418cf.slice\": RecentStats: unable to find data in memory cache]"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.869084 4696 scope.go:117] "RemoveContainer" containerID="bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895476 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895585 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895614 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895637 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895676 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895721 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdntd\" (UniqueName: \"kubernetes.io/projected/190416e8-e777-4ec5-b017-8b28d749252e-kube-api-access-hdntd\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895744 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-logs\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0"
Mar 18 15:57:24 crc kubenswrapper[4696]:
I0318 15:57:24.895777 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895836 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0800565-af0f-408d-8da5-81341ad3e2af-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.895847 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.897883 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.898458 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.899289 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-logs\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 
15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.911766 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.916132 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.916607 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.923927 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.929830 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdntd\" (UniqueName: \"kubernetes.io/projected/190416e8-e777-4ec5-b017-8b28d749252e-kube-api-access-hdntd\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.936676 4696 scope.go:117] 
"RemoveContainer" containerID="9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960" Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.938760 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960\": container with ID starting with 9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960 not found: ID does not exist" containerID="9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.938807 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960"} err="failed to get container status \"9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960\": rpc error: code = NotFound desc = could not find container \"9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960\": container with ID starting with 9db60bca6ddc3775c688566bbc7ee2a257f57e6e24c7fa9d241e10d36bc2e960 not found: ID does not exist" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.938845 4696 scope.go:117] "RemoveContainer" containerID="bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0" Mar 18 15:57:24 crc kubenswrapper[4696]: E0318 15:57:24.945101 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0\": container with ID starting with bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0 not found: ID does not exist" containerID="bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.945146 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0"} err="failed to get container status \"bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0\": rpc error: code = NotFound desc = could not find container \"bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0\": container with ID starting with bab8b6c28c01002196323615eed6d7aaae8cbb7beabdb753933f6e5aba0fabc0 not found: ID does not exist" Mar 18 15:57:24 crc kubenswrapper[4696]: I0318 15:57:24.967746 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.129394 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.496195 4696 generic.go:334] "Generic (PLEG): container finished" podID="319d79c7-8160-4b57-9b13-3797a015cbdf" containerID="c1de6795c4da3f0414c9c4ed7fa8707ce5a74ed14795eb5ac72927b8f0cea52d" exitCode=0 Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.497145 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jvqtp" event={"ID":"319d79c7-8160-4b57-9b13-3797a015cbdf","Type":"ContainerDied","Data":"c1de6795c4da3f0414c9c4ed7fa8707ce5a74ed14795eb5ac72927b8f0cea52d"} Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.510542 4696 generic.go:334] "Generic (PLEG): container finished" podID="4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" containerID="a2126385fe875ffff376a0d9a679939b4e646644b2caacd676f6cbcd88de07f6" exitCode=0 Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.510625 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zfkzp" 
event={"ID":"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617","Type":"ContainerDied","Data":"a2126385fe875ffff376a0d9a679939b4e646644b2caacd676f6cbcd88de07f6"} Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.556824 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0800565-af0f-408d-8da5-81341ad3e2af","Type":"ContainerDied","Data":"46d5927555911b09233f49dc08a1d6f8316a31b5e0a5b454188cf5e273f82ac0"} Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.556914 4696 scope.go:117] "RemoveContainer" containerID="bb4205e4a94db855d71bbec5b97e1b9fcc69fb30e1481ccf5ade2b9d0ea59c60" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.556984 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.581432 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c74b66c77-jb2xr" event={"ID":"2ca71182-1195-450d-a81b-b6db4dff526e","Type":"ContainerStarted","Data":"ac2b44742300d1eff1d4804b2a7d6d37a063c52cd9d0dc98db77042e8e099615"} Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.581501 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c74b66c77-jb2xr" event={"ID":"2ca71182-1195-450d-a81b-b6db4dff526e","Type":"ContainerStarted","Data":"9185425688b7b2eb16da486248052e4f66191948466320dfa616f676c6d38da4"} Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.581513 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c74b66c77-jb2xr" event={"ID":"2ca71182-1195-450d-a81b-b6db4dff526e","Type":"ContainerStarted","Data":"b0a48473c7224e929b493d6003ffc7be6a78bfd513e4bde79181d3cbb945b700"} Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.581570 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 
15:57:25.620885 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204d5c0e-9847-4b8f-be9a-9592da9f389c" path="/var/lib/kubelet/pods/204d5c0e-9847-4b8f-be9a-9592da9f389c/volumes" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.622716 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4416fa4b-8d2a-4580-bdf8-332129f418cf" path="/var/lib/kubelet/pods/4416fa4b-8d2a-4580-bdf8-332129f418cf/volumes" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.625240 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.637196 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.647077 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c74b66c77-jb2xr" podStartSLOduration=2.64705082 podStartE2EDuration="2.64705082s" podCreationTimestamp="2026-03-18 15:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:25.632015443 +0000 UTC m=+1288.638189649" watchObservedRunningTime="2026-03-18 15:57:25.64705082 +0000 UTC m=+1288.653225026" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.681932 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.699386 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.704896 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.705996 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.744194 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.820568 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.827796 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.827908 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwnw\" (UniqueName: \"kubernetes.io/projected/76a6f156-f710-4c15-a20f-649b27d7e7d6-kube-api-access-fcwnw\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.827992 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 
15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.828054 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-logs\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.828099 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.828147 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.828218 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.828326 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc 
kubenswrapper[4696]: I0318 15:57:25.930686 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.930795 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwnw\" (UniqueName: \"kubernetes.io/projected/76a6f156-f710-4c15-a20f-649b27d7e7d6-kube-api-access-fcwnw\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.930856 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.930908 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-logs\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.930950 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.930989 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.931035 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.931080 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.932258 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.932356 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-logs\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.932364 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.941317 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.952497 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.957289 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.958186 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwnw\" (UniqueName: \"kubernetes.io/projected/76a6f156-f710-4c15-a20f-649b27d7e7d6-kube-api-access-fcwnw\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.964015 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:25 crc kubenswrapper[4696]: I0318 15:57:25.973763 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " pod="openstack/glance-default-external-api-0" Mar 18 15:57:26 crc kubenswrapper[4696]: I0318 15:57:26.035245 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:57:26 crc kubenswrapper[4696]: I0318 15:57:26.448342 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:57:26 crc kubenswrapper[4696]: I0318 15:57:26.448803 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:57:26 crc kubenswrapper[4696]: I0318 15:57:26.593447 4696 generic.go:334] "Generic (PLEG): container finished" podID="527c444b-3209-4c1e-addb-ed9404ab8efd" containerID="b33330338b64f9ea3a53789b59a27ef3b0fc2d23861c97d78add74615715ec20" exitCode=0 Mar 18 15:57:26 crc kubenswrapper[4696]: I0318 15:57:26.593641 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fxg9t" event={"ID":"527c444b-3209-4c1e-addb-ed9404ab8efd","Type":"ContainerDied","Data":"b33330338b64f9ea3a53789b59a27ef3b0fc2d23861c97d78add74615715ec20"} Mar 18 15:57:26 crc kubenswrapper[4696]: I0318 15:57:26.595849 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:57:26 crc kubenswrapper[4696]: I0318 15:57:26.596457 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:57:27 crc kubenswrapper[4696]: I0318 
15:57:27.621791 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0800565-af0f-408d-8da5-81341ad3e2af" path="/var/lib/kubelet/pods/f0800565-af0f-408d-8da5-81341ad3e2af/volumes" Mar 18 15:57:27 crc kubenswrapper[4696]: W0318 15:57:27.899687 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod190416e8_e777_4ec5_b017_8b28d749252e.slice/crio-3d04bd896c066070599c5d03286bb0dea2a3217879ef90a04f5956fc7d6c0579 WatchSource:0}: Error finding container 3d04bd896c066070599c5d03286bb0dea2a3217879ef90a04f5956fc7d6c0579: Status 404 returned error can't find the container with id 3d04bd896c066070599c5d03286bb0dea2a3217879ef90a04f5956fc7d6c0579 Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.001356 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.008034 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zfkzp" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.079870 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-combined-ca-bundle\") pod \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.080639 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-config-data\") pod \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.080774 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-config-data\") pod \"319d79c7-8160-4b57-9b13-3797a015cbdf\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.080847 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-scripts\") pod \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.081012 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-combined-ca-bundle\") pod \"319d79c7-8160-4b57-9b13-3797a015cbdf\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.081128 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckz6z\" (UniqueName: 
\"kubernetes.io/projected/319d79c7-8160-4b57-9b13-3797a015cbdf-kube-api-access-ckz6z\") pod \"319d79c7-8160-4b57-9b13-3797a015cbdf\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.081213 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-credential-keys\") pod \"319d79c7-8160-4b57-9b13-3797a015cbdf\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.081282 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-fernet-keys\") pod \"319d79c7-8160-4b57-9b13-3797a015cbdf\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.081354 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-scripts\") pod \"319d79c7-8160-4b57-9b13-3797a015cbdf\" (UID: \"319d79c7-8160-4b57-9b13-3797a015cbdf\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.081599 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78d2r\" (UniqueName: \"kubernetes.io/projected/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-kube-api-access-78d2r\") pod \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.081708 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-logs\") pod \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\" (UID: \"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617\") " Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.082873 4696 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-logs" (OuterVolumeSpecName: "logs") pod "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" (UID: "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.095348 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319d79c7-8160-4b57-9b13-3797a015cbdf-kube-api-access-ckz6z" (OuterVolumeSpecName: "kube-api-access-ckz6z") pod "319d79c7-8160-4b57-9b13-3797a015cbdf" (UID: "319d79c7-8160-4b57-9b13-3797a015cbdf"). InnerVolumeSpecName "kube-api-access-ckz6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.107488 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "319d79c7-8160-4b57-9b13-3797a015cbdf" (UID: "319d79c7-8160-4b57-9b13-3797a015cbdf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.108410 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-kube-api-access-78d2r" (OuterVolumeSpecName: "kube-api-access-78d2r") pod "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" (UID: "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617"). InnerVolumeSpecName "kube-api-access-78d2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.110744 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "319d79c7-8160-4b57-9b13-3797a015cbdf" (UID: "319d79c7-8160-4b57-9b13-3797a015cbdf"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.116826 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-scripts" (OuterVolumeSpecName: "scripts") pod "319d79c7-8160-4b57-9b13-3797a015cbdf" (UID: "319d79c7-8160-4b57-9b13-3797a015cbdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.120902 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-scripts" (OuterVolumeSpecName: "scripts") pod "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" (UID: "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.132550 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "319d79c7-8160-4b57-9b13-3797a015cbdf" (UID: "319d79c7-8160-4b57-9b13-3797a015cbdf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.136843 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-config-data" (OuterVolumeSpecName: "config-data") pod "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" (UID: "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.138883 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-config-data" (OuterVolumeSpecName: "config-data") pod "319d79c7-8160-4b57-9b13-3797a015cbdf" (UID: "319d79c7-8160-4b57-9b13-3797a015cbdf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.143141 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" (UID: "4f27b4c3-3df4-4f88-9bf1-b0f4c242d617"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.184710 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78d2r\" (UniqueName: \"kubernetes.io/projected/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-kube-api-access-78d2r\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185029 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185098 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185157 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185211 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185266 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185328 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185390 4696 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-ckz6z\" (UniqueName: \"kubernetes.io/projected/319d79c7-8160-4b57-9b13-3797a015cbdf-kube-api-access-ckz6z\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185455 4696 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185511 4696 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.185582 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319d79c7-8160-4b57-9b13-3797a015cbdf-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.628256 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"190416e8-e777-4ec5-b017-8b28d749252e","Type":"ContainerStarted","Data":"3d04bd896c066070599c5d03286bb0dea2a3217879ef90a04f5956fc7d6c0579"} Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.630770 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zfkzp" event={"ID":"4f27b4c3-3df4-4f88-9bf1-b0f4c242d617","Type":"ContainerDied","Data":"970b3e113be70b3aafb79b6792088adcbcfc00e95855cff5f0dfa099beacc011"} Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.630966 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="970b3e113be70b3aafb79b6792088adcbcfc00e95855cff5f0dfa099beacc011" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.630799 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zfkzp" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.633181 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jvqtp" event={"ID":"319d79c7-8160-4b57-9b13-3797a015cbdf","Type":"ContainerDied","Data":"2b54a12aa02c0ea6a6db3be501f14c79386878f9c9ddd217ba7085069c2cc1a3"} Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.633247 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b54a12aa02c0ea6a6db3be501f14c79386878f9c9ddd217ba7085069c2cc1a3" Mar 18 15:57:28 crc kubenswrapper[4696]: I0318 15:57:28.633355 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jvqtp" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.109928 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b5955bfd6-zmfrz"] Mar 18 15:57:29 crc kubenswrapper[4696]: E0318 15:57:29.121681 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" containerName="placement-db-sync" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.121726 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" containerName="placement-db-sync" Mar 18 15:57:29 crc kubenswrapper[4696]: E0318 15:57:29.121743 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319d79c7-8160-4b57-9b13-3797a015cbdf" containerName="keystone-bootstrap" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.121750 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="319d79c7-8160-4b57-9b13-3797a015cbdf" containerName="keystone-bootstrap" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.122040 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="319d79c7-8160-4b57-9b13-3797a015cbdf" containerName="keystone-bootstrap" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 
15:57:29.122065 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" containerName="placement-db-sync" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.122635 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b5955bfd6-zmfrz"] Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.122732 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.137380 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.137869 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.137974 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pdpjw" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.138098 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.138181 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.138369 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.204927 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-config-data\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.204994 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-public-tls-certs\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.205139 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds55z\" (UniqueName: \"kubernetes.io/projected/811e96fe-c7fe-424f-b86f-043aaa273d62-kube-api-access-ds55z\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.205170 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-fernet-keys\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.205204 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-combined-ca-bundle\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.205226 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-internal-tls-certs\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.205264 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-credential-keys\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.205297 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-scripts\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.230306 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f47bf59fd-ck79s"] Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.233015 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.235099 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.239227 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.239538 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.239794 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.239362 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7h2qr" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.253218 4696 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/placement-f47bf59fd-ck79s"] Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.306938 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds55z\" (UniqueName: \"kubernetes.io/projected/811e96fe-c7fe-424f-b86f-043aaa273d62-kube-api-access-ds55z\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.307756 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-fernet-keys\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.307905 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-combined-ca-bundle\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.308010 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-internal-tls-certs\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.308130 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-internal-tls-certs\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " 
pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.308222 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-credential-keys\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.308294 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-scripts\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.308398 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-scripts\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.309109 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-public-tls-certs\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.309217 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-logs\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: 
I0318 15:57:29.309317 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-combined-ca-bundle\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.309504 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls8xd\" (UniqueName: \"kubernetes.io/projected/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-kube-api-access-ls8xd\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.309774 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-config-data\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.309892 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-public-tls-certs\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.310397 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-config-data\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.313827 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-internal-tls-certs\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.314665 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-config-data\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.315117 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-credential-keys\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.315594 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-public-tls-certs\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.316817 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-combined-ca-bundle\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.324061 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-scripts\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.324748 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds55z\" (UniqueName: \"kubernetes.io/projected/811e96fe-c7fe-424f-b86f-043aaa273d62-kube-api-access-ds55z\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.324860 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/811e96fe-c7fe-424f-b86f-043aaa273d62-fernet-keys\") pod \"keystone-b5955bfd6-zmfrz\" (UID: \"811e96fe-c7fe-424f-b86f-043aaa273d62\") " pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.422714 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls8xd\" (UniqueName: \"kubernetes.io/projected/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-kube-api-access-ls8xd\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.426451 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-config-data\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.426827 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-internal-tls-certs\") pod 
\"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.426884 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-scripts\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.427026 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-public-tls-certs\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.427097 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-logs\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.427153 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-combined-ca-bundle\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.437164 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-logs\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc 
kubenswrapper[4696]: I0318 15:57:29.449254 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-public-tls-certs\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.453643 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9749b5588-6wsv8"] Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.456683 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-combined-ca-bundle\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.456945 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-internal-tls-certs\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.457075 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.459159 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.466539 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-config-data\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.485908 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-scripts\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.495933 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9749b5588-6wsv8"] Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.514322 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls8xd\" (UniqueName: \"kubernetes.io/projected/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-kube-api-access-ls8xd\") pod \"placement-f47bf59fd-ck79s\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.528860 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-public-tls-certs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.529294 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-config-data\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.529469 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-scripts\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.529637 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-internal-tls-certs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.529789 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-combined-ca-bundle\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.529921 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-logs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.530053 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rn68\" (UniqueName: 
\"kubernetes.io/projected/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-kube-api-access-4rn68\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.558983 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.633280 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-public-tls-certs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.633360 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-config-data\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.633453 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-scripts\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.633582 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-internal-tls-certs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.633643 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-combined-ca-bundle\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.633682 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-logs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.633702 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn68\" (UniqueName: \"kubernetes.io/projected/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-kube-api-access-4rn68\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.636728 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-logs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.638392 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-scripts\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.640126 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-internal-tls-certs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.642302 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-public-tls-certs\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.645509 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-combined-ca-bundle\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.650114 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-config-data\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.660486 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rn68\" (UniqueName: \"kubernetes.io/projected/1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f-kube-api-access-4rn68\") pod \"placement-9749b5588-6wsv8\" (UID: \"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f\") " pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:29 crc kubenswrapper[4696]: I0318 15:57:29.909865 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.604720 4696 scope.go:117] "RemoveContainer" containerID="bd9db8d474d1dcc68d9bd773bc78267a4e93cb536ef69020e1530fce33747aa7" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.683763 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.715126 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fxg9t" event={"ID":"527c444b-3209-4c1e-addb-ed9404ab8efd","Type":"ContainerDied","Data":"e48fe629cc48746936fd4db776af58977aacfe39047a902dc8f828f2f21fc378"} Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.715191 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e48fe629cc48746936fd4db776af58977aacfe39047a902dc8f828f2f21fc378" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.726663 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.760470 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4nrh\" (UniqueName: \"kubernetes.io/projected/527c444b-3209-4c1e-addb-ed9404ab8efd-kube-api-access-n4nrh\") pod \"527c444b-3209-4c1e-addb-ed9404ab8efd\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.760752 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-db-sync-config-data\") pod \"527c444b-3209-4c1e-addb-ed9404ab8efd\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.760961 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-combined-ca-bundle\") pod \"527c444b-3209-4c1e-addb-ed9404ab8efd\" (UID: \"527c444b-3209-4c1e-addb-ed9404ab8efd\") " Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.766550 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527c444b-3209-4c1e-addb-ed9404ab8efd-kube-api-access-n4nrh" (OuterVolumeSpecName: "kube-api-access-n4nrh") pod "527c444b-3209-4c1e-addb-ed9404ab8efd" (UID: "527c444b-3209-4c1e-addb-ed9404ab8efd"). InnerVolumeSpecName "kube-api-access-n4nrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.775312 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "527c444b-3209-4c1e-addb-ed9404ab8efd" (UID: "527c444b-3209-4c1e-addb-ed9404ab8efd"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.776822 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-ntvq9"] Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.777195 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" podUID="81e814fe-6cea-48ae-88e9-00f367333f36" containerName="dnsmasq-dns" containerID="cri-o://a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f" gracePeriod=10 Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.826151 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "527c444b-3209-4c1e-addb-ed9404ab8efd" (UID: "527c444b-3209-4c1e-addb-ed9404ab8efd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.870270 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.870333 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4nrh\" (UniqueName: \"kubernetes.io/projected/527c444b-3209-4c1e-addb-ed9404ab8efd-kube-api-access-n4nrh\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:30 crc kubenswrapper[4696]: I0318 15:57:30.870347 4696 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/527c444b-3209-4c1e-addb-ed9404ab8efd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.490605 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9749b5588-6wsv8"] Mar 18 15:57:31 crc kubenswrapper[4696]: W0318 15:57:31.574020 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e8286ea_ad27_496c_bcc2_d0cf5cd5e39f.slice/crio-15efb2d2ba77f71484bdc470997008afd76928221bfe3f0d9887113ccc2e6057 WatchSource:0}: Error finding container 15efb2d2ba77f71484bdc470997008afd76928221bfe3f0d9887113ccc2e6057: Status 404 returned error can't find the container with id 15efb2d2ba77f71484bdc470997008afd76928221bfe3f0d9887113ccc2e6057 Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.776921 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e295c1a-9787-42a3-ac9c-5252bda652b5","Type":"ContainerStarted","Data":"151d0a071f9dc6476e0215d4b6a101e857bdb3953ef5da2933a196891fdd7548"} Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.779864 4696 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.782499 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9749b5588-6wsv8" event={"ID":"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f","Type":"ContainerStarted","Data":"15efb2d2ba77f71484bdc470997008afd76928221bfe3f0d9887113ccc2e6057"} Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.824156 4696 generic.go:334] "Generic (PLEG): container finished" podID="81e814fe-6cea-48ae-88e9-00f367333f36" containerID="a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f" exitCode=0 Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.824287 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fxg9t" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.824310 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" event={"ID":"81e814fe-6cea-48ae-88e9-00f367333f36","Type":"ContainerDied","Data":"a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f"} Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.824392 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" event={"ID":"81e814fe-6cea-48ae-88e9-00f367333f36","Type":"ContainerDied","Data":"fd21d4e4fc96e5d650394f785baa2d711150c683f7fa6d2153912f34597b27f2"} Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.824466 4696 scope.go:117] "RemoveContainer" containerID="a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.824735 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-ntvq9" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.877892 4696 scope.go:117] "RemoveContainer" containerID="442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.917973 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pst8\" (UniqueName: \"kubernetes.io/projected/81e814fe-6cea-48ae-88e9-00f367333f36-kube-api-access-8pst8\") pod \"81e814fe-6cea-48ae-88e9-00f367333f36\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.918470 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-sb\") pod \"81e814fe-6cea-48ae-88e9-00f367333f36\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.918645 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-config\") pod \"81e814fe-6cea-48ae-88e9-00f367333f36\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.918947 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-svc\") pod \"81e814fe-6cea-48ae-88e9-00f367333f36\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.919257 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-swift-storage-0\") pod \"81e814fe-6cea-48ae-88e9-00f367333f36\" (UID: 
\"81e814fe-6cea-48ae-88e9-00f367333f36\") " Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.919398 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-nb\") pod \"81e814fe-6cea-48ae-88e9-00f367333f36\" (UID: \"81e814fe-6cea-48ae-88e9-00f367333f36\") " Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.929151 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e814fe-6cea-48ae-88e9-00f367333f36-kube-api-access-8pst8" (OuterVolumeSpecName: "kube-api-access-8pst8") pod "81e814fe-6cea-48ae-88e9-00f367333f36" (UID: "81e814fe-6cea-48ae-88e9-00f367333f36"). InnerVolumeSpecName "kube-api-access-8pst8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.941346 4696 scope.go:117] "RemoveContainer" containerID="a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f" Mar 18 15:57:31 crc kubenswrapper[4696]: E0318 15:57:31.945917 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f\": container with ID starting with a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f not found: ID does not exist" containerID="a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.945989 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f"} err="failed to get container status \"a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f\": rpc error: code = NotFound desc = could not find container \"a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f\": container with ID starting 
with a9650936976e85d31e0fa8c95e50634e9be205024d6e3a5ab50acbc6dacb772f not found: ID does not exist" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.946028 4696 scope.go:117] "RemoveContainer" containerID="442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366" Mar 18 15:57:31 crc kubenswrapper[4696]: E0318 15:57:31.946943 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366\": container with ID starting with 442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366 not found: ID does not exist" containerID="442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.946973 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366"} err="failed to get container status \"442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366\": rpc error: code = NotFound desc = could not find container \"442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366\": container with ID starting with 442bdfa16a4909af531e7413cea4a66acea9e514146809940548933da4e96366 not found: ID does not exist" Mar 18 15:57:31 crc kubenswrapper[4696]: I0318 15:57:31.960075 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f47bf59fd-ck79s"] Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.003860 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b5955bfd6-zmfrz"] Mar 18 15:57:32 crc kubenswrapper[4696]: W0318 15:57:32.022665 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod811e96fe_c7fe_424f_b86f_043aaa273d62.slice/crio-7b0f88d4461c058a05771875e140336284b45739fa4ae64ad0faddc391e30917 WatchSource:0}: Error finding 
container 7b0f88d4461c058a05771875e140336284b45739fa4ae64ad0faddc391e30917: Status 404 returned error can't find the container with id 7b0f88d4461c058a05771875e140336284b45739fa4ae64ad0faddc391e30917 Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.025843 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pst8\" (UniqueName: \"kubernetes.io/projected/81e814fe-6cea-48ae-88e9-00f367333f36-kube-api-access-8pst8\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.212981 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81e814fe-6cea-48ae-88e9-00f367333f36" (UID: "81e814fe-6cea-48ae-88e9-00f367333f36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.243479 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.295630 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.323834 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "81e814fe-6cea-48ae-88e9-00f367333f36" (UID: "81e814fe-6cea-48ae-88e9-00f367333f36"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.345431 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.368316 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d98fdb45f-pcbp7"] Mar 18 15:57:32 crc kubenswrapper[4696]: E0318 15:57:32.368825 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527c444b-3209-4c1e-addb-ed9404ab8efd" containerName="barbican-db-sync" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.368843 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="527c444b-3209-4c1e-addb-ed9404ab8efd" containerName="barbican-db-sync" Mar 18 15:57:32 crc kubenswrapper[4696]: E0318 15:57:32.368870 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e814fe-6cea-48ae-88e9-00f367333f36" containerName="dnsmasq-dns" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.368876 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e814fe-6cea-48ae-88e9-00f367333f36" containerName="dnsmasq-dns" Mar 18 15:57:32 crc kubenswrapper[4696]: E0318 15:57:32.368900 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e814fe-6cea-48ae-88e9-00f367333f36" containerName="init" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.368908 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e814fe-6cea-48ae-88e9-00f367333f36" containerName="init" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.369106 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="527c444b-3209-4c1e-addb-ed9404ab8efd" containerName="barbican-db-sync" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.369121 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="81e814fe-6cea-48ae-88e9-00f367333f36" containerName="dnsmasq-dns" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.370275 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.374793 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-27j6v" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.375040 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.375272 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.418287 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "81e814fe-6cea-48ae-88e9-00f367333f36" (UID: "81e814fe-6cea-48ae-88e9-00f367333f36"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.444505 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-config" (OuterVolumeSpecName: "config") pod "81e814fe-6cea-48ae-88e9-00f367333f36" (UID: "81e814fe-6cea-48ae-88e9-00f367333f36"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.447771 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-config-data\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.447841 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7p4m\" (UniqueName: \"kubernetes.io/projected/48631acd-5b2b-48d2-9386-6e023de39655-kube-api-access-w7p4m\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.447952 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-config-data-custom\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.447994 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48631acd-5b2b-48d2-9386-6e023de39655-logs\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.448014 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-combined-ca-bundle\") pod 
\"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.448110 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.448128 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.457977 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "81e814fe-6cea-48ae-88e9-00f367333f36" (UID: "81e814fe-6cea-48ae-88e9-00f367333f36"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.470612 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-745b9b4c58-ztgcm"] Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.472542 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.482896 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.537098 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d98fdb45f-pcbp7"] Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561004 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8r4v\" (UniqueName: \"kubernetes.io/projected/07475c5d-ee2a-407e-986d-245ada3da65c-kube-api-access-l8r4v\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561072 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-config-data-custom\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561107 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07475c5d-ee2a-407e-986d-245ada3da65c-logs\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561133 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48631acd-5b2b-48d2-9386-6e023de39655-logs\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: 
\"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561153 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-config-data-custom\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561174 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-combined-ca-bundle\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-config-data\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561273 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-config-data\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561310 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7p4m\" (UniqueName: 
\"kubernetes.io/projected/48631acd-5b2b-48d2-9386-6e023de39655-kube-api-access-w7p4m\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561337 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-combined-ca-bundle\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.561393 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81e814fe-6cea-48ae-88e9-00f367333f36-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.564666 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48631acd-5b2b-48d2-9386-6e023de39655-logs\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.578391 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-combined-ca-bundle\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.579381 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-config-data\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: 
\"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.591462 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48631acd-5b2b-48d2-9386-6e023de39655-config-data-custom\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.613584 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-745b9b4c58-ztgcm"] Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.657221 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7p4m\" (UniqueName: \"kubernetes.io/projected/48631acd-5b2b-48d2-9386-6e023de39655-kube-api-access-w7p4m\") pod \"barbican-worker-5d98fdb45f-pcbp7\" (UID: \"48631acd-5b2b-48d2-9386-6e023de39655\") " pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.662462 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-config-data-custom\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.678407 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-config-data\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.678639 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-combined-ca-bundle\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.678723 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8r4v\" (UniqueName: \"kubernetes.io/projected/07475c5d-ee2a-407e-986d-245ada3da65c-kube-api-access-l8r4v\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.678841 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07475c5d-ee2a-407e-986d-245ada3da65c-logs\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.679316 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07475c5d-ee2a-407e-986d-245ada3da65c-logs\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.697676 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bsk8g"] Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.699437 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.700350 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-combined-ca-bundle\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.700798 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-config-data-custom\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.720242 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07475c5d-ee2a-407e-986d-245ada3da65c-config-data\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.740330 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d98fdb45f-pcbp7" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.741906 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8r4v\" (UniqueName: \"kubernetes.io/projected/07475c5d-ee2a-407e-986d-245ada3da65c-kube-api-access-l8r4v\") pod \"barbican-keystone-listener-745b9b4c58-ztgcm\" (UID: \"07475c5d-ee2a-407e-986d-245ada3da65c\") " pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.758076 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.858427 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bsk8g"] Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.883619 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.885938 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.886003 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-nb\") pod 
\"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.886158 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q59lk\" (UniqueName: \"kubernetes.io/projected/0ce74221-62af-412b-89b0-c0c77c52a866-kube-api-access-q59lk\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.886209 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.886273 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-config\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.909454 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"76a6f156-f710-4c15-a20f-649b27d7e7d6","Type":"ContainerStarted","Data":"484aa7107b415d042c57704dc78e4e352214915b423c058e1d51255140fcd3d8"} Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.927949 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9749b5588-6wsv8" event={"ID":"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f","Type":"ContainerStarted","Data":"5c4e3d2599c6522c544f390c93771e1fe9cfdebcde76bf9d95c80682c300cf84"} Mar 18 15:57:32 crc 
kubenswrapper[4696]: I0318 15:57:32.930368 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b5955bfd6-zmfrz" event={"ID":"811e96fe-c7fe-424f-b86f-043aaa273d62","Type":"ContainerStarted","Data":"7b0f88d4461c058a05771875e140336284b45739fa4ae64ad0faddc391e30917"} Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.939768 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6585944b98-nd6qg"] Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.941959 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.947640 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.985872 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f47bf59fd-ck79s" event={"ID":"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf","Type":"ContainerStarted","Data":"9cb11409ee1f87729f96978da64e9fd742a05a957945d74b4568301269bbe9d2"} Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990146 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990185 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990213 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990276 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q59lk\" (UniqueName: \"kubernetes.io/projected/0ce74221-62af-412b-89b0-c0c77c52a866-kube-api-access-q59lk\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990311 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990356 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-config\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990445 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-combined-ca-bundle\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990498 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086044f6-b566-472a-b30c-f710c801e907-logs\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990554 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data-custom\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990603 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.990646 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nn9d\" (UniqueName: \"kubernetes.io/projected/086044f6-b566-472a-b30c-f710c801e907-kube-api-access-9nn9d\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.996776 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:32 crc kubenswrapper[4696]: I0318 15:57:32.997433 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.001863 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"190416e8-e777-4ec5-b017-8b28d749252e","Type":"ContainerStarted","Data":"d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30"} Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.004456 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-config\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.004486 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.005971 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.019306 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q59lk\" (UniqueName: 
\"kubernetes.io/projected/0ce74221-62af-412b-89b0-c0c77c52a866-kube-api-access-q59lk\") pod \"dnsmasq-dns-848cf88cfc-bsk8g\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.019380 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6585944b98-nd6qg"] Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.056657 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-ntvq9"] Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.073069 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-ntvq9"] Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.095447 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-combined-ca-bundle\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.095533 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086044f6-b566-472a-b30c-f710c801e907-logs\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.095562 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data-custom\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.095614 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.095639 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nn9d\" (UniqueName: \"kubernetes.io/projected/086044f6-b566-472a-b30c-f710c801e907-kube-api-access-9nn9d\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.097664 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086044f6-b566-472a-b30c-f710c801e907-logs\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.103301 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data-custom\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.103828 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.106590 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-combined-ca-bundle\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.121213 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nn9d\" (UniqueName: \"kubernetes.io/projected/086044f6-b566-472a-b30c-f710c801e907-kube-api-access-9nn9d\") pod \"barbican-api-6585944b98-nd6qg\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.223324 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.270259 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.432347 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d98fdb45f-pcbp7"] Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.616341 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e814fe-6cea-48ae-88e9-00f367333f36" path="/var/lib/kubelet/pods/81e814fe-6cea-48ae-88e9-00f367333f36/volumes" Mar 18 15:57:33 crc kubenswrapper[4696]: I0318 15:57:33.865702 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-745b9b4c58-ztgcm"] Mar 18 15:57:33 crc kubenswrapper[4696]: W0318 15:57:33.918611 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07475c5d_ee2a_407e_986d_245ada3da65c.slice/crio-4f71298116d8c9eb236571b479bd525457f016d6eb0404af67da659a2bea5d0a WatchSource:0}: Error finding container 4f71298116d8c9eb236571b479bd525457f016d6eb0404af67da659a2bea5d0a: 
Status 404 returned error can't find the container with id 4f71298116d8c9eb236571b479bd525457f016d6eb0404af67da659a2bea5d0a Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.027967 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d98fdb45f-pcbp7" event={"ID":"48631acd-5b2b-48d2-9386-6e023de39655","Type":"ContainerStarted","Data":"c1c2e9ecc438312656db379c9e79ffba656ff0a5f449ce46544a89d4785ee678"} Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.044038 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b5955bfd6-zmfrz" event={"ID":"811e96fe-c7fe-424f-b86f-043aaa273d62","Type":"ContainerStarted","Data":"9744ccb3e453b4cd89525db45107a13020231c5feb434b2fa6b1f54c1c72ed51"} Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.045355 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.074369 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b5955bfd6-zmfrz" podStartSLOduration=5.074344232 podStartE2EDuration="5.074344232s" podCreationTimestamp="2026-03-18 15:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:34.070584788 +0000 UTC m=+1297.076759004" watchObservedRunningTime="2026-03-18 15:57:34.074344232 +0000 UTC m=+1297.080518438" Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.078081 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f47bf59fd-ck79s" event={"ID":"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf","Type":"ContainerStarted","Data":"7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f"} Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.082031 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"190416e8-e777-4ec5-b017-8b28d749252e","Type":"ContainerStarted","Data":"7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411"} Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.118947 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" event={"ID":"07475c5d-ee2a-407e-986d-245ada3da65c","Type":"ContainerStarted","Data":"4f71298116d8c9eb236571b479bd525457f016d6eb0404af67da659a2bea5d0a"} Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.138112 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9749b5588-6wsv8" event={"ID":"1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f","Type":"ContainerStarted","Data":"c8c24f50d7c5f03c71290b24df7279e01159e8b6505718ea748a401ea3897147"} Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.138683 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.138854 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.161262 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.161234439 podStartE2EDuration="10.161234439s" podCreationTimestamp="2026-03-18 15:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:34.130071518 +0000 UTC m=+1297.136245744" watchObservedRunningTime="2026-03-18 15:57:34.161234439 +0000 UTC m=+1297.167408645" Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.195027 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9749b5588-6wsv8" podStartSLOduration=5.194998335 podStartE2EDuration="5.194998335s" podCreationTimestamp="2026-03-18 
15:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:34.173165178 +0000 UTC m=+1297.179339384" watchObservedRunningTime="2026-03-18 15:57:34.194998335 +0000 UTC m=+1297.201172551" Mar 18 15:57:34 crc kubenswrapper[4696]: W0318 15:57:34.242733 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod086044f6_b566_472a_b30c_f710c801e907.slice/crio-480a352bf79d9b9ed06c148db225ba22efe933a0581455483e6596794d516c7d WatchSource:0}: Error finding container 480a352bf79d9b9ed06c148db225ba22efe933a0581455483e6596794d516c7d: Status 404 returned error can't find the container with id 480a352bf79d9b9ed06c148db225ba22efe933a0581455483e6596794d516c7d Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.245567 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6585944b98-nd6qg"] Mar 18 15:57:34 crc kubenswrapper[4696]: I0318 15:57:34.376891 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bsk8g"] Mar 18 15:57:34 crc kubenswrapper[4696]: W0318 15:57:34.429658 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ce74221_62af_412b_89b0_c0c77c52a866.slice/crio-af28a161c04b98d94857fa91652fbf7aad564fc4074212a49920b2c9adad6063 WatchSource:0}: Error finding container af28a161c04b98d94857fa91652fbf7aad564fc4074212a49920b2c9adad6063: Status 404 returned error can't find the container with id af28a161c04b98d94857fa91652fbf7aad564fc4074212a49920b2c9adad6063 Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.132679 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.133670 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.163027 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6585944b98-nd6qg" event={"ID":"086044f6-b566-472a-b30c-f710c801e907","Type":"ContainerStarted","Data":"88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af"} Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.163114 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6585944b98-nd6qg" event={"ID":"086044f6-b566-472a-b30c-f710c801e907","Type":"ContainerStarted","Data":"480a352bf79d9b9ed06c148db225ba22efe933a0581455483e6596794d516c7d"} Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.169957 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v2tzn" event={"ID":"4a76866d-35bf-4dee-8fc4-a5c018e9edce","Type":"ContainerStarted","Data":"58cc05b26e706e9f4326b89ac51b0d5c084fb76d960d336bc6aa34f3b91da54b"} Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.187606 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f47bf59fd-ck79s" event={"ID":"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf","Type":"ContainerStarted","Data":"cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4"} Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.188687 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.188907 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.193772 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-v2tzn" podStartSLOduration=3.594813313 podStartE2EDuration="48.193750551s" podCreationTimestamp="2026-03-18 15:56:47 +0000 UTC" 
firstStartedPulling="2026-03-18 15:56:48.936145291 +0000 UTC m=+1251.942319497" lastFinishedPulling="2026-03-18 15:57:33.535082529 +0000 UTC m=+1296.541256735" observedRunningTime="2026-03-18 15:57:35.191962456 +0000 UTC m=+1298.198136662" watchObservedRunningTime="2026-03-18 15:57:35.193750551 +0000 UTC m=+1298.199924757" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.201105 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.201354 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"76a6f156-f710-4c15-a20f-649b27d7e7d6","Type":"ContainerStarted","Data":"5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf"} Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.201859 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.225792 4696 generic.go:334] "Generic (PLEG): container finished" podID="0ce74221-62af-412b-89b0-c0c77c52a866" containerID="a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8" exitCode=0 Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.226005 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" event={"ID":"0ce74221-62af-412b-89b0-c0c77c52a866","Type":"ContainerDied","Data":"a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8"} Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.226042 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" event={"ID":"0ce74221-62af-412b-89b0-c0c77c52a866","Type":"ContainerStarted","Data":"af28a161c04b98d94857fa91652fbf7aad564fc4074212a49920b2c9adad6063"} Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.227766 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.227805 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:35 crc kubenswrapper[4696]: I0318 15:57:35.241985 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f47bf59fd-ck79s" podStartSLOduration=6.241958839 podStartE2EDuration="6.241958839s" podCreationTimestamp="2026-03-18 15:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:35.225775624 +0000 UTC m=+1298.231949830" watchObservedRunningTime="2026-03-18 15:57:35.241958839 +0000 UTC m=+1298.248133035" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.254895 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" event={"ID":"0ce74221-62af-412b-89b0-c0c77c52a866","Type":"ContainerStarted","Data":"70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690"} Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.255240 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.261265 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-756f5c8c54-wjfvb"] Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.307736 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.311975 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6585944b98-nd6qg" event={"ID":"086044f6-b566-472a-b30c-f710c801e907","Type":"ContainerStarted","Data":"2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5"} Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.312161 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.312210 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.315173 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.328005 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.327490 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"76a6f156-f710-4c15-a20f-649b27d7e7d6","Type":"ContainerStarted","Data":"b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79"} Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.352985 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-756f5c8c54-wjfvb"] Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.393203 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" podStartSLOduration=4.393169375 podStartE2EDuration="4.393169375s" podCreationTimestamp="2026-03-18 15:57:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 15:57:36.325990722 +0000 UTC m=+1299.332164948" watchObservedRunningTime="2026-03-18 15:57:36.393169375 +0000 UTC m=+1299.399343591" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.419246 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-combined-ca-bundle\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.419778 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-internal-tls-certs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.420130 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-config-data-custom\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.420269 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d221056-d9f9-47b1-9871-65a83cd55cb4-logs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.420423 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-858nw\" 
(UniqueName: \"kubernetes.io/projected/0d221056-d9f9-47b1-9871-65a83cd55cb4-kube-api-access-858nw\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.420742 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-public-tls-certs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.421132 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-config-data\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.442406 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6585944b98-nd6qg" podStartSLOduration=4.442383298 podStartE2EDuration="4.442383298s" podCreationTimestamp="2026-03-18 15:57:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:36.385658987 +0000 UTC m=+1299.391833213" watchObservedRunningTime="2026-03-18 15:57:36.442383298 +0000 UTC m=+1299.448557504" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.453558 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-696476876d-4rxz2" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection 
refused" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.461419 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.461398115 podStartE2EDuration="11.461398115s" podCreationTimestamp="2026-03-18 15:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:36.432445489 +0000 UTC m=+1299.438619705" watchObservedRunningTime="2026-03-18 15:57:36.461398115 +0000 UTC m=+1299.467572321" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.529918 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-combined-ca-bundle\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.530004 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-internal-tls-certs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.530109 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-config-data-custom\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.530151 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0d221056-d9f9-47b1-9871-65a83cd55cb4-logs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.530272 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-858nw\" (UniqueName: \"kubernetes.io/projected/0d221056-d9f9-47b1-9871-65a83cd55cb4-kube-api-access-858nw\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.530365 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-public-tls-certs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.530579 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-config-data\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.536579 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d221056-d9f9-47b1-9871-65a83cd55cb4-logs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.552475 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-combined-ca-bundle\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.552732 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-internal-tls-certs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.552844 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-config-data-custom\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.553966 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-public-tls-certs\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.554313 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d221056-d9f9-47b1-9871-65a83cd55cb4-config-data\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.557822 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-858nw\" (UniqueName: 
\"kubernetes.io/projected/0d221056-d9f9-47b1-9871-65a83cd55cb4-kube-api-access-858nw\") pod \"barbican-api-756f5c8c54-wjfvb\" (UID: \"0d221056-d9f9-47b1-9871-65a83cd55cb4\") " pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.600253 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-59764c649b-dxxpb" podUID="abd090d6-037c-4cc7-907a-43293ce636ff" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Mar 18 15:57:36 crc kubenswrapper[4696]: I0318 15:57:36.650834 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:37 crc kubenswrapper[4696]: I0318 15:57:37.898110 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-756f5c8c54-wjfvb"] Mar 18 15:57:37 crc kubenswrapper[4696]: W0318 15:57:37.910882 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d221056_d9f9_47b1_9871_65a83cd55cb4.slice/crio-00dbc3634e1df72f64d7519c3b67441aa407b86e1291774460bf885b1815580d WatchSource:0}: Error finding container 00dbc3634e1df72f64d7519c3b67441aa407b86e1291774460bf885b1815580d: Status 404 returned error can't find the container with id 00dbc3634e1df72f64d7519c3b67441aa407b86e1291774460bf885b1815580d Mar 18 15:57:38 crc kubenswrapper[4696]: I0318 15:57:38.352776 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756f5c8c54-wjfvb" event={"ID":"0d221056-d9f9-47b1-9871-65a83cd55cb4","Type":"ContainerStarted","Data":"041510c3c4d78f089863f0f9bab4635278d5bded1d7d06ee78492c0d5b1d99da"} Mar 18 15:57:38 crc kubenswrapper[4696]: I0318 15:57:38.352871 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756f5c8c54-wjfvb" 
event={"ID":"0d221056-d9f9-47b1-9871-65a83cd55cb4","Type":"ContainerStarted","Data":"00dbc3634e1df72f64d7519c3b67441aa407b86e1291774460bf885b1815580d"} Mar 18 15:57:38 crc kubenswrapper[4696]: I0318 15:57:38.367328 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" event={"ID":"07475c5d-ee2a-407e-986d-245ada3da65c","Type":"ContainerStarted","Data":"99b7fa6f828bbf40e60d3d2585f9bdfefc8a79d0d07e18398d81c277e184f12e"} Mar 18 15:57:38 crc kubenswrapper[4696]: I0318 15:57:38.367393 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" event={"ID":"07475c5d-ee2a-407e-986d-245ada3da65c","Type":"ContainerStarted","Data":"99aad4381fac3c9f1d3019ccef0d84d8f17252b331f325b603a4c31d4d7e4e2f"} Mar 18 15:57:38 crc kubenswrapper[4696]: I0318 15:57:38.376149 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d98fdb45f-pcbp7" event={"ID":"48631acd-5b2b-48d2-9386-6e023de39655","Type":"ContainerStarted","Data":"05c173d43129446a067202a5fedf6c76475f2ccc2df0c6c83895d63f8d680e4e"} Mar 18 15:57:38 crc kubenswrapper[4696]: I0318 15:57:38.376213 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d98fdb45f-pcbp7" event={"ID":"48631acd-5b2b-48d2-9386-6e023de39655","Type":"ContainerStarted","Data":"80271cc55a20f6202f7bbfec45c038a91e13840297bd75f292efa9cf05941631"} Mar 18 15:57:38 crc kubenswrapper[4696]: I0318 15:57:38.428426 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-745b9b4c58-ztgcm" podStartSLOduration=2.953817188 podStartE2EDuration="6.42837589s" podCreationTimestamp="2026-03-18 15:57:32 +0000 UTC" firstStartedPulling="2026-03-18 15:57:33.92781003 +0000 UTC m=+1296.933984236" lastFinishedPulling="2026-03-18 15:57:37.402368732 +0000 UTC m=+1300.408542938" observedRunningTime="2026-03-18 15:57:38.395901677 +0000 UTC 
m=+1301.402075893" watchObservedRunningTime="2026-03-18 15:57:38.42837589 +0000 UTC m=+1301.434550096" Mar 18 15:57:38 crc kubenswrapper[4696]: I0318 15:57:38.434243 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d98fdb45f-pcbp7" podStartSLOduration=2.562622626 podStartE2EDuration="6.434220687s" podCreationTimestamp="2026-03-18 15:57:32 +0000 UTC" firstStartedPulling="2026-03-18 15:57:33.529231902 +0000 UTC m=+1296.535406108" lastFinishedPulling="2026-03-18 15:57:37.400829963 +0000 UTC m=+1300.407004169" observedRunningTime="2026-03-18 15:57:38.427290223 +0000 UTC m=+1301.433464429" watchObservedRunningTime="2026-03-18 15:57:38.434220687 +0000 UTC m=+1301.440394893" Mar 18 15:57:39 crc kubenswrapper[4696]: I0318 15:57:39.024382 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:39 crc kubenswrapper[4696]: I0318 15:57:39.034107 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 15:57:39 crc kubenswrapper[4696]: I0318 15:57:39.389620 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756f5c8c54-wjfvb" event={"ID":"0d221056-d9f9-47b1-9871-65a83cd55cb4","Type":"ContainerStarted","Data":"18407037a8a2388dc80170a75b92be1511c78bfa14e9dd88acc7ad25da0ea466"} Mar 18 15:57:39 crc kubenswrapper[4696]: I0318 15:57:39.389773 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:39 crc kubenswrapper[4696]: I0318 15:57:39.389853 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:39 crc kubenswrapper[4696]: I0318 15:57:39.428852 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-756f5c8c54-wjfvb" podStartSLOduration=3.428823969 
podStartE2EDuration="3.428823969s" podCreationTimestamp="2026-03-18 15:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:39.416786797 +0000 UTC m=+1302.422961003" watchObservedRunningTime="2026-03-18 15:57:39.428823969 +0000 UTC m=+1302.434998175" Mar 18 15:57:41 crc kubenswrapper[4696]: I0318 15:57:41.424186 4696 generic.go:334] "Generic (PLEG): container finished" podID="4a76866d-35bf-4dee-8fc4-a5c018e9edce" containerID="58cc05b26e706e9f4326b89ac51b0d5c084fb76d960d336bc6aa34f3b91da54b" exitCode=0 Mar 18 15:57:41 crc kubenswrapper[4696]: I0318 15:57:41.424304 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v2tzn" event={"ID":"4a76866d-35bf-4dee-8fc4-a5c018e9edce","Type":"ContainerDied","Data":"58cc05b26e706e9f4326b89ac51b0d5c084fb76d960d336bc6aa34f3b91da54b"} Mar 18 15:57:42 crc kubenswrapper[4696]: I0318 15:57:42.876173 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.006925 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kwjw\" (UniqueName: \"kubernetes.io/projected/4a76866d-35bf-4dee-8fc4-a5c018e9edce-kube-api-access-5kwjw\") pod \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.007637 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-combined-ca-bundle\") pod \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.007860 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a76866d-35bf-4dee-8fc4-a5c018e9edce-etc-machine-id\") pod \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.007912 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-scripts\") pod \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.008106 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-config-data\") pod \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.008134 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-db-sync-config-data\") pod \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\" (UID: \"4a76866d-35bf-4dee-8fc4-a5c018e9edce\") " Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.008436 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a76866d-35bf-4dee-8fc4-a5c018e9edce-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4a76866d-35bf-4dee-8fc4-a5c018e9edce" (UID: "4a76866d-35bf-4dee-8fc4-a5c018e9edce"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.009492 4696 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a76866d-35bf-4dee-8fc4-a5c018e9edce-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.017745 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a76866d-35bf-4dee-8fc4-a5c018e9edce-kube-api-access-5kwjw" (OuterVolumeSpecName: "kube-api-access-5kwjw") pod "4a76866d-35bf-4dee-8fc4-a5c018e9edce" (UID: "4a76866d-35bf-4dee-8fc4-a5c018e9edce"). InnerVolumeSpecName "kube-api-access-5kwjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.034975 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4a76866d-35bf-4dee-8fc4-a5c018e9edce" (UID: "4a76866d-35bf-4dee-8fc4-a5c018e9edce"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.037253 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-scripts" (OuterVolumeSpecName: "scripts") pod "4a76866d-35bf-4dee-8fc4-a5c018e9edce" (UID: "4a76866d-35bf-4dee-8fc4-a5c018e9edce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.048017 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a76866d-35bf-4dee-8fc4-a5c018e9edce" (UID: "4a76866d-35bf-4dee-8fc4-a5c018e9edce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.100721 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-config-data" (OuterVolumeSpecName: "config-data") pod "4a76866d-35bf-4dee-8fc4-a5c018e9edce" (UID: "4a76866d-35bf-4dee-8fc4-a5c018e9edce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.111304 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.111343 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.111356 4696 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.111368 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kwjw\" (UniqueName: \"kubernetes.io/projected/4a76866d-35bf-4dee-8fc4-a5c018e9edce-kube-api-access-5kwjw\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.111379 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a76866d-35bf-4dee-8fc4-a5c018e9edce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.225985 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.323190 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-nl7sv"] Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.323543 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" podUID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" containerName="dnsmasq-dns" 
containerID="cri-o://4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f" gracePeriod=10 Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.453927 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v2tzn" event={"ID":"4a76866d-35bf-4dee-8fc4-a5c018e9edce","Type":"ContainerDied","Data":"bbf6acfd3cbedfeb767c101bda2de64cf09e1d4337d0fb84790ef7b805617731"} Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.453970 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf6acfd3cbedfeb767c101bda2de64cf09e1d4337d0fb84790ef7b805617731" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.454039 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-v2tzn" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.824130 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:57:43 crc kubenswrapper[4696]: E0318 15:57:43.825419 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a76866d-35bf-4dee-8fc4-a5c018e9edce" containerName="cinder-db-sync" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.825579 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a76866d-35bf-4dee-8fc4-a5c018e9edce" containerName="cinder-db-sync" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.826103 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a76866d-35bf-4dee-8fc4-a5c018e9edce" containerName="cinder-db-sync" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.827976 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.830288 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.842407 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rp4hm" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.842748 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.842869 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.842993 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.895780 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-b2tn8"] Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.897792 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.923644 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-b2tn8"] Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.946143 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.946567 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfea904e-cdd4-44dd-9302-821e35f9ae4b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.946657 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.946748 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-scripts\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.947017 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gqt9\" (UniqueName: 
\"kubernetes.io/projected/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-kube-api-access-5gqt9\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.947157 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.947312 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-svc\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.947460 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.947589 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.947667 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dbwk2\" (UniqueName: \"kubernetes.io/projected/bfea904e-cdd4-44dd-9302-821e35f9ae4b-kube-api-access-dbwk2\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.947736 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:43 crc kubenswrapper[4696]: I0318 15:57:43.947822 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-config\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050289 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050378 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfea904e-cdd4-44dd-9302-821e35f9ae4b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050410 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050436 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-scripts\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050574 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gqt9\" (UniqueName: \"kubernetes.io/projected/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-kube-api-access-5gqt9\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050619 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050671 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-svc\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050717 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: 
\"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050749 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwk2\" (UniqueName: \"kubernetes.io/projected/bfea904e-cdd4-44dd-9302-821e35f9ae4b-kube-api-access-dbwk2\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050771 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050798 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.050823 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-config\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.052081 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-config\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" 
Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.052709 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-svc\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.053226 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.054224 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.054843 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.065254 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-scripts\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.065337 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.068447 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfea904e-cdd4-44dd-9302-821e35f9ae4b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.073326 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.075210 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.092458 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbwk2\" (UniqueName: \"kubernetes.io/projected/bfea904e-cdd4-44dd-9302-821e35f9ae4b-kube-api-access-dbwk2\") pod \"cinder-scheduler-0\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.160548 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gqt9\" (UniqueName: \"kubernetes.io/projected/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-kube-api-access-5gqt9\") pod \"dnsmasq-dns-6578955fd5-b2tn8\" (UID: 
\"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.199313 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.212777 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.215846 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.220799 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.246408 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.246900 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.258173 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-logs\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.258491 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.258618 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.258715 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data-custom\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.258900 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-scripts\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.259017 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsz9k\" (UniqueName: \"kubernetes.io/projected/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-kube-api-access-xsz9k\") pod \"cinder-api-0\" (UID: 
\"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.259100 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.259205 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.361445 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-config\") pod \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.361669 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-swift-storage-0\") pod \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.361737 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-nb\") pod \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.361794 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-svc\") pod \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.361845 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cqlr\" (UniqueName: \"kubernetes.io/projected/8531ccd7-8b34-4e09-8b34-1fcf88234f42-kube-api-access-4cqlr\") pod \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.361955 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-sb\") pod \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\" (UID: \"8531ccd7-8b34-4e09-8b34-1fcf88234f42\") " Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.362309 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-scripts\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.362371 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsz9k\" (UniqueName: \"kubernetes.io/projected/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-kube-api-access-xsz9k\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.362392 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.362417 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.362459 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-logs\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.362487 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.362505 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data-custom\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.367694 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.371683 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8531ccd7-8b34-4e09-8b34-1fcf88234f42-kube-api-access-4cqlr" (OuterVolumeSpecName: "kube-api-access-4cqlr") pod "8531ccd7-8b34-4e09-8b34-1fcf88234f42" (UID: "8531ccd7-8b34-4e09-8b34-1fcf88234f42"). InnerVolumeSpecName "kube-api-access-4cqlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.379464 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data-custom\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.379873 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-logs\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.400613 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.401065 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-scripts\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.409542 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data\") pod \"cinder-api-0\" (UID: 
\"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.465340 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cqlr\" (UniqueName: \"kubernetes.io/projected/8531ccd7-8b34-4e09-8b34-1fcf88234f42-kube-api-access-4cqlr\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.482987 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsz9k\" (UniqueName: \"kubernetes.io/projected/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-kube-api-access-xsz9k\") pod \"cinder-api-0\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.491448 4696 generic.go:334] "Generic (PLEG): container finished" podID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" containerID="4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f" exitCode=0 Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.491509 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" event={"ID":"8531ccd7-8b34-4e09-8b34-1fcf88234f42","Type":"ContainerDied","Data":"4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f"} Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.491568 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" event={"ID":"8531ccd7-8b34-4e09-8b34-1fcf88234f42","Type":"ContainerDied","Data":"7e337a4206dc70b633f2b9537d4104b7b2860411e89dc5609cbf2b721c487bac"} Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.491582 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-nl7sv" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.491612 4696 scope.go:117] "RemoveContainer" containerID="4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.511459 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8531ccd7-8b34-4e09-8b34-1fcf88234f42" (UID: "8531ccd7-8b34-4e09-8b34-1fcf88234f42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.524413 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8531ccd7-8b34-4e09-8b34-1fcf88234f42" (UID: "8531ccd7-8b34-4e09-8b34-1fcf88234f42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.540617 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8531ccd7-8b34-4e09-8b34-1fcf88234f42" (UID: "8531ccd7-8b34-4e09-8b34-1fcf88234f42"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.540981 4696 scope.go:117] "RemoveContainer" containerID="33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.542253 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-config" (OuterVolumeSpecName: "config") pod "8531ccd7-8b34-4e09-8b34-1fcf88234f42" (UID: "8531ccd7-8b34-4e09-8b34-1fcf88234f42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.569910 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.570202 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.570215 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.570231 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.578283 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.588184 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8531ccd7-8b34-4e09-8b34-1fcf88234f42" (UID: "8531ccd7-8b34-4e09-8b34-1fcf88234f42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.605869 4696 scope.go:117] "RemoveContainer" containerID="4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f" Mar 18 15:57:44 crc kubenswrapper[4696]: E0318 15:57:44.615598 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f\": container with ID starting with 4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f not found: ID does not exist" containerID="4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.615659 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f"} err="failed to get container status \"4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f\": rpc error: code = NotFound desc = could not find container \"4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f\": container with ID starting with 4d6ef86954abcda51f97b199a55ccc724445812ec71a1b1ae6b413fff466328f not found: ID does not exist" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.615694 4696 scope.go:117] "RemoveContainer" containerID="33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42" Mar 18 15:57:44 crc kubenswrapper[4696]: E0318 15:57:44.617866 4696 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42\": container with ID starting with 33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42 not found: ID does not exist" containerID="33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.617897 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42"} err="failed to get container status \"33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42\": rpc error: code = NotFound desc = could not find container \"33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42\": container with ID starting with 33e988adcd07011082b7f2e4f41a3ca490cf7b314a1ab44bed9c0d8e17770a42 not found: ID does not exist" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.678132 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8531ccd7-8b34-4e09-8b34-1fcf88234f42-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.886095 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-nl7sv"] Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.912102 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-nl7sv"] Mar 18 15:57:44 crc kubenswrapper[4696]: I0318 15:57:44.943228 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.078767 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-b2tn8"] Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.208214 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] 
Mar 18 15:57:45 crc kubenswrapper[4696]: W0318 15:57:45.213171 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode844127a_dbf6_4ba0_a6f3_e1c0a0b1b864.slice/crio-5532f254c9c639b8e924f994b90fb2d1a78be8c7fc303205536317c0b329d264 WatchSource:0}: Error finding container 5532f254c9c639b8e924f994b90fb2d1a78be8c7fc303205536317c0b329d264: Status 404 returned error can't find the container with id 5532f254c9c639b8e924f994b90fb2d1a78be8c7fc303205536317c0b329d264 Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.518706 4696 generic.go:334] "Generic (PLEG): container finished" podID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" containerID="f6ee94ca50e28aff25324321004feee98995fb655e66703e42ad66981b6b8600" exitCode=0 Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.519546 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" event={"ID":"25e269f9-a3d2-48a3-ac5b-dcdc18a31107","Type":"ContainerDied","Data":"f6ee94ca50e28aff25324321004feee98995fb655e66703e42ad66981b6b8600"} Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.519638 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" event={"ID":"25e269f9-a3d2-48a3-ac5b-dcdc18a31107","Type":"ContainerStarted","Data":"8ce0f5b33490c37fed3bdf60182c807607e378dfc1b1ea70b4a413df73a33872"} Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.549019 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bfea904e-cdd4-44dd-9302-821e35f9ae4b","Type":"ContainerStarted","Data":"6c35031aca774df67f1b0900ec6df679547841ff0abf5239b3354e08c4cb74da"} Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.551714 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864","Type":"ContainerStarted","Data":"5532f254c9c639b8e924f994b90fb2d1a78be8c7fc303205536317c0b329d264"} Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.619797 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" path="/var/lib/kubelet/pods/8531ccd7-8b34-4e09-8b34-1fcf88234f42/volumes" Mar 18 15:57:45 crc kubenswrapper[4696]: I0318 15:57:45.656345 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.036926 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.037641 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.109473 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.131897 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.141179 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.558970 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.599252 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864","Type":"ContainerStarted","Data":"1b1cf9a0684cbdec247bfbc650b74e63666f1f913629a845baa493b0cc8f2cf6"} Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 
15:57:46.601238 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-59764c649b-dxxpb" podUID="abd090d6-037c-4cc7-907a-43293ce636ff" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.608876 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" event={"ID":"25e269f9-a3d2-48a3-ac5b-dcdc18a31107","Type":"ContainerStarted","Data":"8b584819fe9014bb22195965686d5a101df9b02fefcd55614758091106e8ffb5"} Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.608932 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.609511 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.609583 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 15:57:46 crc kubenswrapper[4696]: I0318 15:57:46.639742 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" podStartSLOduration=3.639714081 podStartE2EDuration="3.639714081s" podCreationTimestamp="2026-03-18 15:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:46.634229713 +0000 UTC m=+1309.640403929" watchObservedRunningTime="2026-03-18 15:57:46.639714081 +0000 UTC m=+1309.645888287" Mar 18 15:57:47 crc kubenswrapper[4696]: I0318 15:57:47.660985 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"bfea904e-cdd4-44dd-9302-821e35f9ae4b","Type":"ContainerStarted","Data":"e07f90e917ed403a021d179eb321ac1d826b01891d8089ca3dee69376f8a5152"} Mar 18 15:57:47 crc kubenswrapper[4696]: I0318 15:57:47.706829 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864","Type":"ContainerStarted","Data":"5c32ae0538d81f24fcec99902d41c5c2d19e2580f9c9992f36dccfd183d96d6b"} Mar 18 15:57:47 crc kubenswrapper[4696]: I0318 15:57:47.708725 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerName="cinder-api-log" containerID="cri-o://1b1cf9a0684cbdec247bfbc650b74e63666f1f913629a845baa493b0cc8f2cf6" gracePeriod=30 Mar 18 15:57:47 crc kubenswrapper[4696]: I0318 15:57:47.708927 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 15:57:47 crc kubenswrapper[4696]: I0318 15:57:47.709354 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerName="cinder-api" containerID="cri-o://5c32ae0538d81f24fcec99902d41c5c2d19e2580f9c9992f36dccfd183d96d6b" gracePeriod=30 Mar 18 15:57:47 crc kubenswrapper[4696]: I0318 15:57:47.735813 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.735792084 podStartE2EDuration="3.735792084s" podCreationTimestamp="2026-03-18 15:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:47.734065891 +0000 UTC m=+1310.740240097" watchObservedRunningTime="2026-03-18 15:57:47.735792084 +0000 UTC m=+1310.741966290" Mar 18 15:57:48 crc kubenswrapper[4696]: I0318 15:57:48.727478 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerID="5c32ae0538d81f24fcec99902d41c5c2d19e2580f9c9992f36dccfd183d96d6b" exitCode=0 Mar 18 15:57:48 crc kubenswrapper[4696]: I0318 15:57:48.727867 4696 generic.go:334] "Generic (PLEG): container finished" podID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerID="1b1cf9a0684cbdec247bfbc650b74e63666f1f913629a845baa493b0cc8f2cf6" exitCode=143 Mar 18 15:57:48 crc kubenswrapper[4696]: I0318 15:57:48.727917 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864","Type":"ContainerDied","Data":"5c32ae0538d81f24fcec99902d41c5c2d19e2580f9c9992f36dccfd183d96d6b"} Mar 18 15:57:48 crc kubenswrapper[4696]: I0318 15:57:48.727954 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864","Type":"ContainerDied","Data":"1b1cf9a0684cbdec247bfbc650b74e63666f1f913629a845baa493b0cc8f2cf6"} Mar 18 15:57:48 crc kubenswrapper[4696]: I0318 15:57:48.729866 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bfea904e-cdd4-44dd-9302-821e35f9ae4b","Type":"ContainerStarted","Data":"2023f016fe3002f77300c5ff55c43c244bfc9f82232eefe6b2109c2bc08e8b6b"} Mar 18 15:57:48 crc kubenswrapper[4696]: I0318 15:57:48.769216 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.6656771169999995 podStartE2EDuration="5.769183138s" podCreationTimestamp="2026-03-18 15:57:43 +0000 UTC" firstStartedPulling="2026-03-18 15:57:44.95713754 +0000 UTC m=+1307.963311736" lastFinishedPulling="2026-03-18 15:57:46.060643551 +0000 UTC m=+1309.066817757" observedRunningTime="2026-03-18 15:57:48.755108525 +0000 UTC m=+1311.761282721" watchObservedRunningTime="2026-03-18 15:57:48.769183138 +0000 UTC m=+1311.775357344" Mar 18 15:57:48 crc kubenswrapper[4696]: I0318 15:57:48.839527 4696 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.013070 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-scripts\") pod \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.013193 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data-custom\") pod \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.013327 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-combined-ca-bundle\") pod \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.013454 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-logs\") pod \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.013483 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsz9k\" (UniqueName: \"kubernetes.io/projected/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-kube-api-access-xsz9k\") pod \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.013573 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data\") pod \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.013595 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-etc-machine-id\") pod \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\" (UID: \"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864\") " Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.014079 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" (UID: "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.014358 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-logs" (OuterVolumeSpecName: "logs") pod "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" (UID: "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.027740 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-kube-api-access-xsz9k" (OuterVolumeSpecName: "kube-api-access-xsz9k") pod "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" (UID: "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864"). InnerVolumeSpecName "kube-api-access-xsz9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.033731 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-scripts" (OuterVolumeSpecName: "scripts") pod "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" (UID: "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.039735 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" (UID: "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.094706 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" (UID: "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.118199 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.118252 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.118264 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsz9k\" (UniqueName: \"kubernetes.io/projected/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-kube-api-access-xsz9k\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.118277 4696 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.118285 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.118293 4696 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.129373 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data" (OuterVolumeSpecName: "config-data") pod "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" (UID: "e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.200675 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.220086 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.424200 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.540431 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.683686 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.683841 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.754922 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.755039 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864","Type":"ContainerDied","Data":"5532f254c9c639b8e924f994b90fb2d1a78be8c7fc303205536317c0b329d264"} Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.755089 4696 scope.go:117] "RemoveContainer" containerID="5c32ae0538d81f24fcec99902d41c5c2d19e2580f9c9992f36dccfd183d96d6b" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.795875 4696 scope.go:117] "RemoveContainer" containerID="1b1cf9a0684cbdec247bfbc650b74e63666f1f913629a845baa493b0cc8f2cf6" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.802109 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.821458 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.834606 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:57:49 crc kubenswrapper[4696]: E0318 15:57:49.835598 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerName="cinder-api" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.835694 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerName="cinder-api" Mar 18 15:57:49 crc kubenswrapper[4696]: E0318 15:57:49.835830 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerName="cinder-api-log" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.835886 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerName="cinder-api-log" Mar 18 15:57:49 crc kubenswrapper[4696]: E0318 15:57:49.835960 4696 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" containerName="init" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.836013 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" containerName="init" Mar 18 15:57:49 crc kubenswrapper[4696]: E0318 15:57:49.836598 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" containerName="dnsmasq-dns" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.836667 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" containerName="dnsmasq-dns" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.836975 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerName="cinder-api" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.837046 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8531ccd7-8b34-4e09-8b34-1fcf88234f42" containerName="dnsmasq-dns" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.837106 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" containerName="cinder-api-log" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.838420 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.839220 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.869023 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-756f5c8c54-wjfvb" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.879462 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.879925 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.880823 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.942982 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.943032 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce02fef3-40fa-46fe-a496-0aada019e24b-logs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.943078 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmzrp\" (UniqueName: \"kubernetes.io/projected/ce02fef3-40fa-46fe-a496-0aada019e24b-kube-api-access-bmzrp\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " 
pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.943146 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.943186 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-config-data\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.943243 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce02fef3-40fa-46fe-a496-0aada019e24b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.943293 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.943309 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-scripts\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:49 crc kubenswrapper[4696]: I0318 15:57:49.943327 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.007583 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6585944b98-nd6qg"] Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.008257 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6585944b98-nd6qg" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api-log" containerID="cri-o://88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af" gracePeriod=30 Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.009098 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6585944b98-nd6qg" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api" containerID="cri-o://2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5" gracePeriod=30 Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045000 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce02fef3-40fa-46fe-a496-0aada019e24b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045088 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045110 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-scripts\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045132 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045166 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce02fef3-40fa-46fe-a496-0aada019e24b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045184 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045278 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce02fef3-40fa-46fe-a496-0aada019e24b-logs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045423 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmzrp\" (UniqueName: \"kubernetes.io/projected/ce02fef3-40fa-46fe-a496-0aada019e24b-kube-api-access-bmzrp\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " 
pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045620 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.045733 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-config-data\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.056511 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.059046 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce02fef3-40fa-46fe-a496-0aada019e24b-logs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.065780 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-config-data\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.076208 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-scripts\") pod 
\"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.079782 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.082034 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmzrp\" (UniqueName: \"kubernetes.io/projected/ce02fef3-40fa-46fe-a496-0aada019e24b-kube-api-access-bmzrp\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.083328 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.095120 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce02fef3-40fa-46fe-a496-0aada019e24b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce02fef3-40fa-46fe-a496-0aada019e24b\") " pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.191655 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.197643 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.742879 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.823184 4696 generic.go:334] "Generic (PLEG): container finished" podID="086044f6-b566-472a-b30c-f710c801e907" containerID="88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af" exitCode=143 Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.823320 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6585944b98-nd6qg" event={"ID":"086044f6-b566-472a-b30c-f710c801e907","Type":"ContainerDied","Data":"88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af"} Mar 18 15:57:50 crc kubenswrapper[4696]: I0318 15:57:50.893809 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.012452 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c74b66c77-jb2xr"] Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.013266 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c74b66c77-jb2xr" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-httpd" containerID="cri-o://ac2b44742300d1eff1d4804b2a7d6d37a063c52cd9d0dc98db77042e8e099615" gracePeriod=30 Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.013596 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c74b66c77-jb2xr" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-api" containerID="cri-o://9185425688b7b2eb16da486248052e4f66191948466320dfa616f676c6d38da4" 
gracePeriod=30 Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.022478 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5c74b66c77-jb2xr" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": EOF" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.042727 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-667fb94989-br52g"] Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.045973 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.070191 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-667fb94989-br52g"] Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.199199 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-config\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.199327 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-combined-ca-bundle\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.199406 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-ovndb-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " 
pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.199442 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-public-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.199493 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-internal-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.199569 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g9sk\" (UniqueName: \"kubernetes.io/projected/13f1595d-6eb1-41a2-8cd9-12d80a38303f-kube-api-access-7g9sk\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.199598 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-httpd-config\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.301274 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g9sk\" (UniqueName: \"kubernetes.io/projected/13f1595d-6eb1-41a2-8cd9-12d80a38303f-kube-api-access-7g9sk\") pod \"neutron-667fb94989-br52g\" (UID: 
\"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.301333 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-httpd-config\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.301380 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-config\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.301411 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-combined-ca-bundle\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.301469 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-ovndb-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.301509 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-public-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc 
kubenswrapper[4696]: I0318 15:57:51.301586 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-internal-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.311479 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-internal-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.312323 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-config\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.312767 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-ovndb-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.312778 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-public-tls-certs\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.324176 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7g9sk\" (UniqueName: \"kubernetes.io/projected/13f1595d-6eb1-41a2-8cd9-12d80a38303f-kube-api-access-7g9sk\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.324229 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-httpd-config\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.324511 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f1595d-6eb1-41a2-8cd9-12d80a38303f-combined-ca-bundle\") pod \"neutron-667fb94989-br52g\" (UID: \"13f1595d-6eb1-41a2-8cd9-12d80a38303f\") " pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.460549 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.617707 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864" path="/var/lib/kubelet/pods/e844127a-dbf6-4ba0-a6f3-e1c0a0b1b864/volumes" Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.847360 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce02fef3-40fa-46fe-a496-0aada019e24b","Type":"ContainerStarted","Data":"7654532ce6b402d32c3f14f475bf64c7eba8204d4bc938de6527e56a32a5c37d"} Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.875016 4696 generic.go:334] "Generic (PLEG): container finished" podID="2ca71182-1195-450d-a81b-b6db4dff526e" containerID="ac2b44742300d1eff1d4804b2a7d6d37a063c52cd9d0dc98db77042e8e099615" exitCode=0 Mar 18 15:57:51 crc kubenswrapper[4696]: I0318 15:57:51.875093 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c74b66c77-jb2xr" event={"ID":"2ca71182-1195-450d-a81b-b6db4dff526e","Type":"ContainerDied","Data":"ac2b44742300d1eff1d4804b2a7d6d37a063c52cd9d0dc98db77042e8e099615"} Mar 18 15:57:52 crc kubenswrapper[4696]: I0318 15:57:52.111595 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:57:52 crc kubenswrapper[4696]: I0318 15:57:52.236290 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-667fb94989-br52g"] Mar 18 15:57:52 crc kubenswrapper[4696]: I0318 15:57:52.904556 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce02fef3-40fa-46fe-a496-0aada019e24b","Type":"ContainerStarted","Data":"d130ddc215569c8d339678b86c6b6c178bb98cde10efbffe7960c77bc02c22ec"} Mar 18 15:57:52 crc kubenswrapper[4696]: I0318 15:57:52.914422 4696 generic.go:334] "Generic (PLEG): container finished" podID="2ca71182-1195-450d-a81b-b6db4dff526e" 
containerID="9185425688b7b2eb16da486248052e4f66191948466320dfa616f676c6d38da4" exitCode=0 Mar 18 15:57:52 crc kubenswrapper[4696]: I0318 15:57:52.914629 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c74b66c77-jb2xr" event={"ID":"2ca71182-1195-450d-a81b-b6db4dff526e","Type":"ContainerDied","Data":"9185425688b7b2eb16da486248052e4f66191948466320dfa616f676c6d38da4"} Mar 18 15:57:52 crc kubenswrapper[4696]: I0318 15:57:52.923847 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667fb94989-br52g" event={"ID":"13f1595d-6eb1-41a2-8cd9-12d80a38303f","Type":"ContainerStarted","Data":"38ca6bdffd3e05179cd91ef5b31efcb38aea1fc53fdcbb6705efdfdded7d89b7"} Mar 18 15:57:52 crc kubenswrapper[4696]: I0318 15:57:52.923909 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667fb94989-br52g" event={"ID":"13f1595d-6eb1-41a2-8cd9-12d80a38303f","Type":"ContainerStarted","Data":"183f6a4fdc4fd875130bc0e3d40d1860e9699a52af730a4544d61aa52b338063"} Mar 18 15:57:52 crc kubenswrapper[4696]: I0318 15:57:52.959052 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.071934 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-httpd-config\") pod \"2ca71182-1195-450d-a81b-b6db4dff526e\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.073969 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98rrm\" (UniqueName: \"kubernetes.io/projected/2ca71182-1195-450d-a81b-b6db4dff526e-kube-api-access-98rrm\") pod \"2ca71182-1195-450d-a81b-b6db4dff526e\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.074134 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-internal-tls-certs\") pod \"2ca71182-1195-450d-a81b-b6db4dff526e\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.074932 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-config\") pod \"2ca71182-1195-450d-a81b-b6db4dff526e\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.075027 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-ovndb-tls-certs\") pod \"2ca71182-1195-450d-a81b-b6db4dff526e\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.075123 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-combined-ca-bundle\") pod \"2ca71182-1195-450d-a81b-b6db4dff526e\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.075458 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-public-tls-certs\") pod \"2ca71182-1195-450d-a81b-b6db4dff526e\" (UID: \"2ca71182-1195-450d-a81b-b6db4dff526e\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.080091 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca71182-1195-450d-a81b-b6db4dff526e-kube-api-access-98rrm" (OuterVolumeSpecName: "kube-api-access-98rrm") pod "2ca71182-1195-450d-a81b-b6db4dff526e" (UID: "2ca71182-1195-450d-a81b-b6db4dff526e"). InnerVolumeSpecName "kube-api-access-98rrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.081790 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2ca71182-1195-450d-a81b-b6db4dff526e" (UID: "2ca71182-1195-450d-a81b-b6db4dff526e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.138957 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ca71182-1195-450d-a81b-b6db4dff526e" (UID: "2ca71182-1195-450d-a81b-b6db4dff526e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.145835 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2ca71182-1195-450d-a81b-b6db4dff526e" (UID: "2ca71182-1195-450d-a81b-b6db4dff526e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.168721 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ca71182-1195-450d-a81b-b6db4dff526e" (UID: "2ca71182-1195-450d-a81b-b6db4dff526e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.181275 4696 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.181320 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.181330 4696 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.181340 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-httpd-config\") on node 
\"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.181350 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98rrm\" (UniqueName: \"kubernetes.io/projected/2ca71182-1195-450d-a81b-b6db4dff526e-kube-api-access-98rrm\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.205614 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-config" (OuterVolumeSpecName: "config") pod "2ca71182-1195-450d-a81b-b6db4dff526e" (UID: "2ca71182-1195-450d-a81b-b6db4dff526e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.220624 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2ca71182-1195-450d-a81b-b6db4dff526e" (UID: "2ca71182-1195-450d-a81b-b6db4dff526e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.271585 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6585944b98-nd6qg" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: connect: connection refused" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.271714 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6585944b98-nd6qg" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: connect: connection refused" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.283411 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.283461 4696 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca71182-1195-450d-a81b-b6db4dff526e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.701501 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.795452 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nn9d\" (UniqueName: \"kubernetes.io/projected/086044f6-b566-472a-b30c-f710c801e907-kube-api-access-9nn9d\") pod \"086044f6-b566-472a-b30c-f710c801e907\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.795759 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086044f6-b566-472a-b30c-f710c801e907-logs\") pod \"086044f6-b566-472a-b30c-f710c801e907\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.795873 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data-custom\") pod \"086044f6-b566-472a-b30c-f710c801e907\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.795917 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data\") pod \"086044f6-b566-472a-b30c-f710c801e907\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.795992 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-combined-ca-bundle\") pod \"086044f6-b566-472a-b30c-f710c801e907\" (UID: \"086044f6-b566-472a-b30c-f710c801e907\") " Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.796374 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/086044f6-b566-472a-b30c-f710c801e907-logs" (OuterVolumeSpecName: "logs") pod "086044f6-b566-472a-b30c-f710c801e907" (UID: "086044f6-b566-472a-b30c-f710c801e907"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.797494 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086044f6-b566-472a-b30c-f710c801e907-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.802444 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "086044f6-b566-472a-b30c-f710c801e907" (UID: "086044f6-b566-472a-b30c-f710c801e907"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.802839 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086044f6-b566-472a-b30c-f710c801e907-kube-api-access-9nn9d" (OuterVolumeSpecName: "kube-api-access-9nn9d") pod "086044f6-b566-472a-b30c-f710c801e907" (UID: "086044f6-b566-472a-b30c-f710c801e907"). InnerVolumeSpecName "kube-api-access-9nn9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.838240 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "086044f6-b566-472a-b30c-f710c801e907" (UID: "086044f6-b566-472a-b30c-f710c801e907"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.857606 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data" (OuterVolumeSpecName: "config-data") pod "086044f6-b566-472a-b30c-f710c801e907" (UID: "086044f6-b566-472a-b30c-f710c801e907"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.901596 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nn9d\" (UniqueName: \"kubernetes.io/projected/086044f6-b566-472a-b30c-f710c801e907-kube-api-access-9nn9d\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.901646 4696 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.901663 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.901677 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086044f6-b566-472a-b30c-f710c801e907-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.943268 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c74b66c77-jb2xr" event={"ID":"2ca71182-1195-450d-a81b-b6db4dff526e","Type":"ContainerDied","Data":"b0a48473c7224e929b493d6003ffc7be6a78bfd513e4bde79181d3cbb945b700"} Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.943326 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c74b66c77-jb2xr" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.943376 4696 scope.go:117] "RemoveContainer" containerID="ac2b44742300d1eff1d4804b2a7d6d37a063c52cd9d0dc98db77042e8e099615" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.952877 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce02fef3-40fa-46fe-a496-0aada019e24b","Type":"ContainerStarted","Data":"5a2002b0411126031739f9007793293244d384545823a68f7707fb8c5ed1b8dd"} Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.953193 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.956601 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667fb94989-br52g" event={"ID":"13f1595d-6eb1-41a2-8cd9-12d80a38303f","Type":"ContainerStarted","Data":"5bbe49ba71dce42ea973d06d0918930182990a6450e55c5823ca66fef00ddd1c"} Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.957178 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-667fb94989-br52g" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.960357 4696 generic.go:334] "Generic (PLEG): container finished" podID="086044f6-b566-472a-b30c-f710c801e907" containerID="2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5" exitCode=0 Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.960393 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6585944b98-nd6qg" event={"ID":"086044f6-b566-472a-b30c-f710c801e907","Type":"ContainerDied","Data":"2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5"} Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.960437 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6585944b98-nd6qg" 
event={"ID":"086044f6-b566-472a-b30c-f710c801e907","Type":"ContainerDied","Data":"480a352bf79d9b9ed06c148db225ba22efe933a0581455483e6596794d516c7d"} Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.960478 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6585944b98-nd6qg" Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.978572 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c74b66c77-jb2xr"] Mar 18 15:57:53 crc kubenswrapper[4696]: I0318 15:57:53.993307 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c74b66c77-jb2xr"] Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:53.998430 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.998396346 podStartE2EDuration="4.998396346s" podCreationTimestamp="2026-03-18 15:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:53.996475888 +0000 UTC m=+1317.002650104" watchObservedRunningTime="2026-03-18 15:57:53.998396346 +0000 UTC m=+1317.004570562" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.007197 4696 scope.go:117] "RemoveContainer" containerID="9185425688b7b2eb16da486248052e4f66191948466320dfa616f676c6d38da4" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.030491 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-667fb94989-br52g" podStartSLOduration=3.030455159 podStartE2EDuration="3.030455159s" podCreationTimestamp="2026-03-18 15:57:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:54.028380657 +0000 UTC m=+1317.034554993" watchObservedRunningTime="2026-03-18 15:57:54.030455159 +0000 UTC m=+1317.036629365" Mar 18 15:57:54 crc kubenswrapper[4696]: 
I0318 15:57:54.056238 4696 scope.go:117] "RemoveContainer" containerID="2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.059754 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6585944b98-nd6qg"] Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.069395 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6585944b98-nd6qg"] Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.090650 4696 scope.go:117] "RemoveContainer" containerID="88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.160902 4696 scope.go:117] "RemoveContainer" containerID="2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5" Mar 18 15:57:54 crc kubenswrapper[4696]: E0318 15:57:54.161678 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5\": container with ID starting with 2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5 not found: ID does not exist" containerID="2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.161749 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5"} err="failed to get container status \"2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5\": rpc error: code = NotFound desc = could not find container \"2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5\": container with ID starting with 2c1e1103e8d1f94761ac10822b7d945aaf9d9f6082020560fd58e25f71aebed5 not found: ID does not exist" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.161792 4696 scope.go:117] "RemoveContainer" 
containerID="88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af" Mar 18 15:57:54 crc kubenswrapper[4696]: E0318 15:57:54.162190 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af\": container with ID starting with 88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af not found: ID does not exist" containerID="88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.162216 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af"} err="failed to get container status \"88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af\": rpc error: code = NotFound desc = could not find container \"88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af\": container with ID starting with 88c7dc156b455c2a08eca47c237b9054c7d58ca60c2cbe12d9199e8d0cfd87af not found: ID does not exist" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.248761 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.325151 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bsk8g"] Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.325911 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" podUID="0ce74221-62af-412b-89b0-c0c77c52a866" containerName="dnsmasq-dns" containerID="cri-o://70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690" gracePeriod=10 Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.450967 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.547705 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.917568 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.977814 4696 generic.go:334] "Generic (PLEG): container finished" podID="0ce74221-62af-412b-89b0-c0c77c52a866" containerID="70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690" exitCode=0 Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.977933 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" event={"ID":"0ce74221-62af-412b-89b0-c0c77c52a866","Type":"ContainerDied","Data":"70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690"} Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.977980 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" event={"ID":"0ce74221-62af-412b-89b0-c0c77c52a866","Type":"ContainerDied","Data":"af28a161c04b98d94857fa91652fbf7aad564fc4074212a49920b2c9adad6063"} Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.978006 4696 scope.go:117] "RemoveContainer" containerID="70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.978224 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-bsk8g" Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.989224 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerName="cinder-scheduler" containerID="cri-o://e07f90e917ed403a021d179eb321ac1d826b01891d8089ca3dee69376f8a5152" gracePeriod=30 Mar 18 15:57:54 crc kubenswrapper[4696]: I0318 15:57:54.989407 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerName="probe" containerID="cri-o://2023f016fe3002f77300c5ff55c43c244bfc9f82232eefe6b2109c2bc08e8b6b" gracePeriod=30 Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.036778 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-svc\") pod \"0ce74221-62af-412b-89b0-c0c77c52a866\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.036855 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q59lk\" (UniqueName: \"kubernetes.io/projected/0ce74221-62af-412b-89b0-c0c77c52a866-kube-api-access-q59lk\") pod \"0ce74221-62af-412b-89b0-c0c77c52a866\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.036989 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-sb\") pod \"0ce74221-62af-412b-89b0-c0c77c52a866\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.037130 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-config\") pod \"0ce74221-62af-412b-89b0-c0c77c52a866\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.037297 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-swift-storage-0\") pod \"0ce74221-62af-412b-89b0-c0c77c52a866\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.037350 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-nb\") pod \"0ce74221-62af-412b-89b0-c0c77c52a866\" (UID: \"0ce74221-62af-412b-89b0-c0c77c52a866\") " Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.044363 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce74221-62af-412b-89b0-c0c77c52a866-kube-api-access-q59lk" (OuterVolumeSpecName: "kube-api-access-q59lk") pod "0ce74221-62af-412b-89b0-c0c77c52a866" (UID: "0ce74221-62af-412b-89b0-c0c77c52a866"). InnerVolumeSpecName "kube-api-access-q59lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.116799 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ce74221-62af-412b-89b0-c0c77c52a866" (UID: "0ce74221-62af-412b-89b0-c0c77c52a866"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.116972 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0ce74221-62af-412b-89b0-c0c77c52a866" (UID: "0ce74221-62af-412b-89b0-c0c77c52a866"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.118196 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ce74221-62af-412b-89b0-c0c77c52a866" (UID: "0ce74221-62af-412b-89b0-c0c77c52a866"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.122218 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-config" (OuterVolumeSpecName: "config") pod "0ce74221-62af-412b-89b0-c0c77c52a866" (UID: "0ce74221-62af-412b-89b0-c0c77c52a866"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.125357 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ce74221-62af-412b-89b0-c0c77c52a866" (UID: "0ce74221-62af-412b-89b0-c0c77c52a866"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.139825 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.139851 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.139862 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.139872 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q59lk\" (UniqueName: \"kubernetes.io/projected/0ce74221-62af-412b-89b0-c0c77c52a866-kube-api-access-q59lk\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.139881 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.139890 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ce74221-62af-412b-89b0-c0c77c52a866-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.314309 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bsk8g"] Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.323863 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bsk8g"] Mar 18 
15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.617965 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="086044f6-b566-472a-b30c-f710c801e907" path="/var/lib/kubelet/pods/086044f6-b566-472a-b30c-f710c801e907/volumes" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.619006 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce74221-62af-412b-89b0-c0c77c52a866" path="/var/lib/kubelet/pods/0ce74221-62af-412b-89b0-c0c77c52a866/volumes" Mar 18 15:57:55 crc kubenswrapper[4696]: I0318 15:57:55.620267 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" path="/var/lib/kubelet/pods/2ca71182-1195-450d-a81b-b6db4dff526e/volumes" Mar 18 15:57:57 crc kubenswrapper[4696]: I0318 15:57:57.015984 4696 generic.go:334] "Generic (PLEG): container finished" podID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerID="2023f016fe3002f77300c5ff55c43c244bfc9f82232eefe6b2109c2bc08e8b6b" exitCode=0 Mar 18 15:57:57 crc kubenswrapper[4696]: I0318 15:57:57.016075 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bfea904e-cdd4-44dd-9302-821e35f9ae4b","Type":"ContainerDied","Data":"2023f016fe3002f77300c5ff55c43c244bfc9f82232eefe6b2109c2bc08e8b6b"} Mar 18 15:57:58 crc kubenswrapper[4696]: I0318 15:57:58.758257 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:57:59 crc kubenswrapper[4696]: I0318 15:57:59.900880 4696 scope.go:117] "RemoveContainer" containerID="a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8" Mar 18 15:57:59 crc kubenswrapper[4696]: I0318 15:57:59.972225 4696 scope.go:117] "RemoveContainer" containerID="70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690" Mar 18 15:57:59 crc kubenswrapper[4696]: E0318 15:57:59.979770 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690\": container with ID starting with 70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690 not found: ID does not exist" containerID="70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690" Mar 18 15:57:59 crc kubenswrapper[4696]: I0318 15:57:59.979830 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690"} err="failed to get container status \"70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690\": rpc error: code = NotFound desc = could not find container \"70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690\": container with ID starting with 70ca343ae08649a3a352c24d3ffd4c61548152247d67bac2b2db5a7b44c16690 not found: ID does not exist" Mar 18 15:57:59 crc kubenswrapper[4696]: I0318 15:57:59.979874 4696 scope.go:117] "RemoveContainer" containerID="a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8" Mar 18 15:57:59 crc kubenswrapper[4696]: E0318 15:57:59.984997 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8\": container with ID starting with a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8 not found: ID does not exist" containerID="a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8" Mar 18 15:57:59 crc kubenswrapper[4696]: I0318 15:57:59.985056 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8"} err="failed to get container status \"a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8\": rpc error: code = NotFound desc = could not find container 
\"a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8\": container with ID starting with a2a1979bb752a3a3f9f0b555210646feae5b0a15e560f437b6cb6b60163a65f8 not found: ID does not exist" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.202573 4696 generic.go:334] "Generic (PLEG): container finished" podID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerID="e07f90e917ed403a021d179eb321ac1d826b01891d8089ca3dee69376f8a5152" exitCode=0 Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.202635 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bfea904e-cdd4-44dd-9302-821e35f9ae4b","Type":"ContainerDied","Data":"e07f90e917ed403a021d179eb321ac1d826b01891d8089ca3dee69376f8a5152"} Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.205746 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.207998 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564158-srm4f"] Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.209635 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce74221-62af-412b-89b0-c0c77c52a866" containerName="init" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.209670 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce74221-62af-412b-89b0-c0c77c52a866" containerName="init" Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.209688 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-api" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.209722 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-api" Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.210097 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api-log" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210142 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api-log" Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.210161 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210224 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api" Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.210261 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerName="probe" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210270 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerName="probe" Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.210285 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-httpd" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210293 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-httpd" Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.210305 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce74221-62af-412b-89b0-c0c77c52a866" containerName="dnsmasq-dns" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210313 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce74221-62af-412b-89b0-c0c77c52a866" containerName="dnsmasq-dns" Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.210344 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" 
containerName="cinder-scheduler" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210352 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerName="cinder-scheduler" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210650 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-httpd" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210668 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca71182-1195-450d-a81b-b6db4dff526e" containerName="neutron-api" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210690 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210705 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="086044f6-b566-472a-b30c-f710c801e907" containerName="barbican-api-log" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210716 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce74221-62af-412b-89b0-c0c77c52a866" containerName="dnsmasq-dns" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210729 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerName="cinder-scheduler" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.210742 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" containerName="probe" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.211787 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-srm4f" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.215772 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.216066 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.216244 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.285036 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfea904e-cdd4-44dd-9302-821e35f9ae4b-etc-machine-id\") pod \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.285484 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-combined-ca-bundle\") pod \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.285622 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfea904e-cdd4-44dd-9302-821e35f9ae4b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bfea904e-cdd4-44dd-9302-821e35f9ae4b" (UID: "bfea904e-cdd4-44dd-9302-821e35f9ae4b"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.287456 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbwk2\" (UniqueName: \"kubernetes.io/projected/bfea904e-cdd4-44dd-9302-821e35f9ae4b-kube-api-access-dbwk2\") pod \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.287698 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-scripts\") pod \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.287875 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data-custom\") pod \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.287982 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data\") pod \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\" (UID: \"bfea904e-cdd4-44dd-9302-821e35f9ae4b\") " Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.288334 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47fjn\" (UniqueName: \"kubernetes.io/projected/461ec5b2-5f78-4e16-b815-120c9181c0d5-kube-api-access-47fjn\") pod \"auto-csr-approver-29564158-srm4f\" (UID: \"461ec5b2-5f78-4e16-b815-120c9181c0d5\") " pod="openshift-infra/auto-csr-approver-29564158-srm4f" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.288607 4696 reconciler_common.go:293] 
"Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfea904e-cdd4-44dd-9302-821e35f9ae4b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.303622 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-srm4f"] Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.308102 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfea904e-cdd4-44dd-9302-821e35f9ae4b-kube-api-access-dbwk2" (OuterVolumeSpecName: "kube-api-access-dbwk2") pod "bfea904e-cdd4-44dd-9302-821e35f9ae4b" (UID: "bfea904e-cdd4-44dd-9302-821e35f9ae4b"). InnerVolumeSpecName "kube-api-access-dbwk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.308261 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-scripts" (OuterVolumeSpecName: "scripts") pod "bfea904e-cdd4-44dd-9302-821e35f9ae4b" (UID: "bfea904e-cdd4-44dd-9302-821e35f9ae4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.323791 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bfea904e-cdd4-44dd-9302-821e35f9ae4b" (UID: "bfea904e-cdd4-44dd-9302-821e35f9ae4b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.391351 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47fjn\" (UniqueName: \"kubernetes.io/projected/461ec5b2-5f78-4e16-b815-120c9181c0d5-kube-api-access-47fjn\") pod \"auto-csr-approver-29564158-srm4f\" (UID: \"461ec5b2-5f78-4e16-b815-120c9181c0d5\") " pod="openshift-infra/auto-csr-approver-29564158-srm4f" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.391468 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.391482 4696 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.391497 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbwk2\" (UniqueName: \"kubernetes.io/projected/bfea904e-cdd4-44dd-9302-821e35f9ae4b-kube-api-access-dbwk2\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.414637 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47fjn\" (UniqueName: \"kubernetes.io/projected/461ec5b2-5f78-4e16-b815-120c9181c0d5-kube-api-access-47fjn\") pod \"auto-csr-approver-29564158-srm4f\" (UID: \"461ec5b2-5f78-4e16-b815-120c9181c0d5\") " pod="openshift-infra/auto-csr-approver-29564158-srm4f" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.416730 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfea904e-cdd4-44dd-9302-821e35f9ae4b" (UID: 
"bfea904e-cdd4-44dd-9302-821e35f9ae4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.466872 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data" (OuterVolumeSpecName: "config-data") pod "bfea904e-cdd4-44dd-9302-821e35f9ae4b" (UID: "bfea904e-cdd4-44dd-9302-821e35f9ae4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:00 crc kubenswrapper[4696]: E0318 15:58:00.480976 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.537003 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.537052 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfea904e-cdd4-44dd-9302-821e35f9ae4b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:00 crc kubenswrapper[4696]: I0318 15:58:00.581993 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-srm4f" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.174659 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-59764c649b-dxxpb" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.219935 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-srm4f"] Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.229441 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bfea904e-cdd4-44dd-9302-821e35f9ae4b","Type":"ContainerDied","Data":"6c35031aca774df67f1b0900ec6df679547841ff0abf5239b3354e08c4cb74da"} Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.229504 4696 scope.go:117] "RemoveContainer" containerID="2023f016fe3002f77300c5ff55c43c244bfc9f82232eefe6b2109c2bc08e8b6b" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.231103 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.265815 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e295c1a-9787-42a3-ac9c-5252bda652b5","Type":"ContainerStarted","Data":"fb555445370e4aa80dadc7770577692450790261e03afe38f4da271c577ff3b2"} Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.266050 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.266066 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="ceilometer-notification-agent" containerID="cri-o://7a723587643d21a0b67f5b2a09f847f0f2f128465d455ccdbee4d6d19cdcaea8" gracePeriod=30 Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.266270 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="proxy-httpd" containerID="cri-o://fb555445370e4aa80dadc7770577692450790261e03afe38f4da271c577ff3b2" gracePeriod=30 Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.266327 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="sg-core" containerID="cri-o://151d0a071f9dc6476e0215d4b6a101e857bdb3953ef5da2933a196891fdd7548" gracePeriod=30 Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.288120 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-696476876d-4rxz2"] Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.290008 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-696476876d-4rxz2" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon" 
containerID="cri-o://aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420" gracePeriod=30 Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.290280 4696 scope.go:117] "RemoveContainer" containerID="e07f90e917ed403a021d179eb321ac1d826b01891d8089ca3dee69376f8a5152" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.289954 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-696476876d-4rxz2" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon-log" containerID="cri-o://edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d" gracePeriod=30 Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.406627 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.447458 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.488325 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.490539 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.495351 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.495803 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.571039 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.571134 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.571238 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrvkt\" (UniqueName: \"kubernetes.io/projected/e2ee43b8-090b-4daf-907b-9a21c3986e42-kube-api-access-hrvkt\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.571274 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 
15:58:01.571306 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-config-data\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.571321 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2ee43b8-090b-4daf-907b-9a21c3986e42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.587259 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.591836 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9749b5588-6wsv8" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.612732 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfea904e-cdd4-44dd-9302-821e35f9ae4b" path="/var/lib/kubelet/pods/bfea904e-cdd4-44dd-9302-821e35f9ae4b/volumes" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.673767 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrvkt\" (UniqueName: \"kubernetes.io/projected/e2ee43b8-090b-4daf-907b-9a21c3986e42-kube-api-access-hrvkt\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.673858 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-combined-ca-bundle\") pod \"cinder-scheduler-0\" 
(UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.673894 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-config-data\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.673915 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2ee43b8-090b-4daf-907b-9a21c3986e42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.673954 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.673998 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.674798 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2ee43b8-090b-4daf-907b-9a21c3986e42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.694749 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.695277 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-config-data\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.695379 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.702879 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.705039 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f47bf59fd-ck79s"] Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.711312 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee43b8-090b-4daf-907b-9a21c3986e42-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.720321 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrvkt\" (UniqueName: \"kubernetes.io/projected/e2ee43b8-090b-4daf-907b-9a21c3986e42-kube-api-access-hrvkt\") pod \"cinder-scheduler-0\" (UID: 
\"e2ee43b8-090b-4daf-907b-9a21c3986e42\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:01 crc kubenswrapper[4696]: I0318 15:58:01.858479 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.202565 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.322495 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-srm4f" event={"ID":"461ec5b2-5f78-4e16-b815-120c9181c0d5","Type":"ContainerStarted","Data":"2260a4d8fa413ea5c28daa5a77f79f913a52839361e0521d1a72f3ff50fe8d0a"} Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.330962 4696 generic.go:334] "Generic (PLEG): container finished" podID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerID="fb555445370e4aa80dadc7770577692450790261e03afe38f4da271c577ff3b2" exitCode=0 Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.331007 4696 generic.go:334] "Generic (PLEG): container finished" podID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerID="151d0a071f9dc6476e0215d4b6a101e857bdb3953ef5da2933a196891fdd7548" exitCode=2 Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.337705 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e295c1a-9787-42a3-ac9c-5252bda652b5","Type":"ContainerDied","Data":"fb555445370e4aa80dadc7770577692450790261e03afe38f4da271c577ff3b2"} Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.337798 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e295c1a-9787-42a3-ac9c-5252bda652b5","Type":"ContainerDied","Data":"151d0a071f9dc6476e0215d4b6a101e857bdb3953ef5da2933a196891fdd7548"} Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.338122 4696 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-f47bf59fd-ck79s" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerName="placement-log" containerID="cri-o://7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f" gracePeriod=30 Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.339251 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-f47bf59fd-ck79s" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerName="placement-api" containerID="cri-o://cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4" gracePeriod=30 Mar 18 15:58:02 crc kubenswrapper[4696]: I0318 15:58:02.420827 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:58:03 crc kubenswrapper[4696]: I0318 15:58:03.319732 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 15:58:03 crc kubenswrapper[4696]: I0318 15:58:03.402964 4696 generic.go:334] "Generic (PLEG): container finished" podID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerID="7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f" exitCode=143 Mar 18 15:58:03 crc kubenswrapper[4696]: I0318 15:58:03.403395 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f47bf59fd-ck79s" event={"ID":"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf","Type":"ContainerDied","Data":"7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f"} Mar 18 15:58:03 crc kubenswrapper[4696]: I0318 15:58:03.415546 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-srm4f" event={"ID":"461ec5b2-5f78-4e16-b815-120c9181c0d5","Type":"ContainerStarted","Data":"4b9356aa13581adef14af62b6776b37bd2c42ddc713862187e0a3b3f219ab5ad"} Mar 18 15:58:03 crc kubenswrapper[4696]: I0318 15:58:03.419203 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e2ee43b8-090b-4daf-907b-9a21c3986e42","Type":"ContainerStarted","Data":"9b6cf72220861762346b6cb01d8ec73eac7fcb42b742959d0422485752a6ec2b"} Mar 18 15:58:03 crc kubenswrapper[4696]: I0318 15:58:03.419237 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2ee43b8-090b-4daf-907b-9a21c3986e42","Type":"ContainerStarted","Data":"e16204625128b4f831b1924717a7c7ff82afdc4cc179b1c537aa59bf6b04ee50"} Mar 18 15:58:03 crc kubenswrapper[4696]: I0318 15:58:03.419314 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b5955bfd6-zmfrz" Mar 18 15:58:03 crc kubenswrapper[4696]: I0318 15:58:03.444963 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564158-srm4f" podStartSLOduration=2.497918586 podStartE2EDuration="3.444937335s" podCreationTimestamp="2026-03-18 15:58:00 +0000 UTC" firstStartedPulling="2026-03-18 15:58:01.225800601 +0000 UTC m=+1324.231974807" lastFinishedPulling="2026-03-18 15:58:02.17281935 +0000 UTC m=+1325.178993556" observedRunningTime="2026-03-18 15:58:03.434839072 +0000 UTC m=+1326.441013278" watchObservedRunningTime="2026-03-18 15:58:03.444937335 +0000 UTC m=+1326.451111541" Mar 18 15:58:04 crc kubenswrapper[4696]: I0318 15:58:04.432042 4696 generic.go:334] "Generic (PLEG): container finished" podID="461ec5b2-5f78-4e16-b815-120c9181c0d5" containerID="4b9356aa13581adef14af62b6776b37bd2c42ddc713862187e0a3b3f219ab5ad" exitCode=0 Mar 18 15:58:04 crc kubenswrapper[4696]: I0318 15:58:04.432111 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-srm4f" event={"ID":"461ec5b2-5f78-4e16-b815-120c9181c0d5","Type":"ContainerDied","Data":"4b9356aa13581adef14af62b6776b37bd2c42ddc713862187e0a3b3f219ab5ad"} Mar 18 15:58:04 crc kubenswrapper[4696]: I0318 15:58:04.436438 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"e2ee43b8-090b-4daf-907b-9a21c3986e42","Type":"ContainerStarted","Data":"11129798cd79280e9b3a81fe20fc7a68d5b1d1667601b5f57d75d95bcc8600fe"} Mar 18 15:58:04 crc kubenswrapper[4696]: I0318 15:58:04.484918 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.484888723 podStartE2EDuration="3.484888723s" podCreationTimestamp="2026-03-18 15:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:04.47794799 +0000 UTC m=+1327.484122206" watchObservedRunningTime="2026-03-18 15:58:04.484888723 +0000 UTC m=+1327.491062929" Mar 18 15:58:05 crc kubenswrapper[4696]: I0318 15:58:05.495804 4696 generic.go:334] "Generic (PLEG): container finished" podID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerID="7a723587643d21a0b67f5b2a09f847f0f2f128465d455ccdbee4d6d19cdcaea8" exitCode=0 Mar 18 15:58:05 crc kubenswrapper[4696]: I0318 15:58:05.496509 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e295c1a-9787-42a3-ac9c-5252bda652b5","Type":"ContainerDied","Data":"7a723587643d21a0b67f5b2a09f847f0f2f128465d455ccdbee4d6d19cdcaea8"} Mar 18 15:58:05 crc kubenswrapper[4696]: I0318 15:58:05.500469 4696 generic.go:334] "Generic (PLEG): container finished" podID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerID="aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420" exitCode=0 Mar 18 15:58:05 crc kubenswrapper[4696]: I0318 15:58:05.500671 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-696476876d-4rxz2" event={"ID":"72021b21-00cf-4c33-be2d-b24f20dc0f9f","Type":"ContainerDied","Data":"aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420"} Mar 18 15:58:05 crc kubenswrapper[4696]: I0318 15:58:05.894043 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.013886 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-scripts\") pod \"6e295c1a-9787-42a3-ac9c-5252bda652b5\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.013958 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjzpm\" (UniqueName: \"kubernetes.io/projected/6e295c1a-9787-42a3-ac9c-5252bda652b5-kube-api-access-xjzpm\") pod \"6e295c1a-9787-42a3-ac9c-5252bda652b5\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.013992 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-log-httpd\") pod \"6e295c1a-9787-42a3-ac9c-5252bda652b5\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.014252 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-sg-core-conf-yaml\") pod \"6e295c1a-9787-42a3-ac9c-5252bda652b5\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.014343 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-config-data\") pod \"6e295c1a-9787-42a3-ac9c-5252bda652b5\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.014410 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-run-httpd\") pod \"6e295c1a-9787-42a3-ac9c-5252bda652b5\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.014443 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-combined-ca-bundle\") pod \"6e295c1a-9787-42a3-ac9c-5252bda652b5\" (UID: \"6e295c1a-9787-42a3-ac9c-5252bda652b5\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.016741 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e295c1a-9787-42a3-ac9c-5252bda652b5" (UID: "6e295c1a-9787-42a3-ac9c-5252bda652b5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.018307 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e295c1a-9787-42a3-ac9c-5252bda652b5" (UID: "6e295c1a-9787-42a3-ac9c-5252bda652b5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.049200 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-scripts" (OuterVolumeSpecName: "scripts") pod "6e295c1a-9787-42a3-ac9c-5252bda652b5" (UID: "6e295c1a-9787-42a3-ac9c-5252bda652b5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.049221 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e295c1a-9787-42a3-ac9c-5252bda652b5-kube-api-access-xjzpm" (OuterVolumeSpecName: "kube-api-access-xjzpm") pod "6e295c1a-9787-42a3-ac9c-5252bda652b5" (UID: "6e295c1a-9787-42a3-ac9c-5252bda652b5"). InnerVolumeSpecName "kube-api-access-xjzpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.087091 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-srm4f" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.097726 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e295c1a-9787-42a3-ac9c-5252bda652b5" (UID: "6e295c1a-9787-42a3-ac9c-5252bda652b5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.116146 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47fjn\" (UniqueName: \"kubernetes.io/projected/461ec5b2-5f78-4e16-b815-120c9181c0d5-kube-api-access-47fjn\") pod \"461ec5b2-5f78-4e16-b815-120c9181c0d5\" (UID: \"461ec5b2-5f78-4e16-b815-120c9181c0d5\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.116871 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.116893 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.116903 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.116913 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjzpm\" (UniqueName: \"kubernetes.io/projected/6e295c1a-9787-42a3-ac9c-5252bda652b5-kube-api-access-xjzpm\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.116927 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e295c1a-9787-42a3-ac9c-5252bda652b5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.120971 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "6e295c1a-9787-42a3-ac9c-5252bda652b5" (UID: "6e295c1a-9787-42a3-ac9c-5252bda652b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.130812 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/461ec5b2-5f78-4e16-b815-120c9181c0d5-kube-api-access-47fjn" (OuterVolumeSpecName: "kube-api-access-47fjn") pod "461ec5b2-5f78-4e16-b815-120c9181c0d5" (UID: "461ec5b2-5f78-4e16-b815-120c9181c0d5"). InnerVolumeSpecName "kube-api-access-47fjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.160071 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.194874 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-config-data" (OuterVolumeSpecName: "config-data") pod "6e295c1a-9787-42a3-ac9c-5252bda652b5" (UID: "6e295c1a-9787-42a3-ac9c-5252bda652b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.219126 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-logs\") pod \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.219223 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-public-tls-certs\") pod \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.219311 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-scripts\") pod \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.219366 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls8xd\" (UniqueName: \"kubernetes.io/projected/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-kube-api-access-ls8xd\") pod \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.219423 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-internal-tls-certs\") pod \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.219567 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-config-data\") pod \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.219634 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-combined-ca-bundle\") pod \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\" (UID: \"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf\") " Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.220165 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47fjn\" (UniqueName: \"kubernetes.io/projected/461ec5b2-5f78-4e16-b815-120c9181c0d5-kube-api-access-47fjn\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.220186 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.220169 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-logs" (OuterVolumeSpecName: "logs") pod "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" (UID: "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.220197 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e295c1a-9787-42a3-ac9c-5252bda652b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.224799 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-kube-api-access-ls8xd" (OuterVolumeSpecName: "kube-api-access-ls8xd") pod "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" (UID: "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf"). InnerVolumeSpecName "kube-api-access-ls8xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.236236 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-scripts" (OuterVolumeSpecName: "scripts") pod "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" (UID: "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.272695 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 15:58:06 crc kubenswrapper[4696]: E0318 15:58:06.273254 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="ceilometer-notification-agent" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273280 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="ceilometer-notification-agent" Mar 18 15:58:06 crc kubenswrapper[4696]: E0318 15:58:06.273304 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="proxy-httpd" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273316 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="proxy-httpd" Mar 18 15:58:06 crc kubenswrapper[4696]: E0318 15:58:06.273335 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerName="placement-api" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273344 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerName="placement-api" Mar 18 15:58:06 crc kubenswrapper[4696]: E0318 15:58:06.273359 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461ec5b2-5f78-4e16-b815-120c9181c0d5" containerName="oc" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273366 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="461ec5b2-5f78-4e16-b815-120c9181c0d5" containerName="oc" Mar 18 15:58:06 crc kubenswrapper[4696]: E0318 15:58:06.273386 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerName="placement-log" Mar 18 15:58:06 
crc kubenswrapper[4696]: I0318 15:58:06.273394 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerName="placement-log" Mar 18 15:58:06 crc kubenswrapper[4696]: E0318 15:58:06.273408 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="sg-core" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273415 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="sg-core" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273659 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerName="placement-log" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273683 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="sg-core" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273699 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="ceilometer-notification-agent" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273722 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" containerName="proxy-httpd" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273733 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerName="placement-api" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.273749 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="461ec5b2-5f78-4e16-b815-120c9181c0d5" containerName="oc" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.274419 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.278170 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-skcw7" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.278981 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.279705 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.291181 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.321649 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-openstack-config-secret\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.321752 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.321840 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg4nd\" (UniqueName: \"kubernetes.io/projected/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-kube-api-access-zg4nd\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.321955 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-openstack-config\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.322056 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.322069 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.322080 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls8xd\" (UniqueName: \"kubernetes.io/projected/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-kube-api-access-ls8xd\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.324888 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" (UID: "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.328838 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-config-data" (OuterVolumeSpecName: "config-data") pod "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" (UID: "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.366610 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" (UID: "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.379062 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" (UID: "44b79cc9-0ed9-4f84-a352-fd6ba6d28daf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.424532 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.425289 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4nd\" (UniqueName: \"kubernetes.io/projected/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-kube-api-access-zg4nd\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.425376 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-openstack-config\") pod \"openstackclient\" (UID: 
\"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.426328 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-openstack-config\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.426488 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-openstack-config-secret\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.427084 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.427106 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.427122 4696 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.427135 4696 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.431283 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-openstack-config-secret\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.433333 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.444204 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg4nd\" (UniqueName: \"kubernetes.io/projected/d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e-kube-api-access-zg4nd\") pod \"openstackclient\" (UID: \"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e\") " pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.449739 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-696476876d-4rxz2" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.516906 4696 generic.go:334] "Generic (PLEG): container finished" podID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" containerID="cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4" exitCode=0 Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.516988 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f47bf59fd-ck79s" event={"ID":"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf","Type":"ContainerDied","Data":"cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4"} Mar 18 15:58:06 crc 
kubenswrapper[4696]: I0318 15:58:06.517024 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f47bf59fd-ck79s" event={"ID":"44b79cc9-0ed9-4f84-a352-fd6ba6d28daf","Type":"ContainerDied","Data":"9cb11409ee1f87729f96978da64e9fd742a05a957945d74b4568301269bbe9d2"} Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.517046 4696 scope.go:117] "RemoveContainer" containerID="cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.517412 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f47bf59fd-ck79s" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.543144 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e295c1a-9787-42a3-ac9c-5252bda652b5","Type":"ContainerDied","Data":"5a19e230e0245bb36d4544140a0c3346de38f6d54e6eff72ba8448086f555bb7"} Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.543203 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.547154 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-srm4f" event={"ID":"461ec5b2-5f78-4e16-b815-120c9181c0d5","Type":"ContainerDied","Data":"2260a4d8fa413ea5c28daa5a77f79f913a52839361e0521d1a72f3ff50fe8d0a"} Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.547222 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2260a4d8fa413ea5c28daa5a77f79f913a52839361e0521d1a72f3ff50fe8d0a" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.547289 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-srm4f" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.558689 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-55vm6"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.569085 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-55vm6"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.579548 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f47bf59fd-ck79s"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.590666 4696 scope.go:117] "RemoveContainer" containerID="7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.598943 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f47bf59fd-ck79s"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.603176 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.697298 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.703144 4696 scope.go:117] "RemoveContainer" containerID="cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4" Mar 18 15:58:06 crc kubenswrapper[4696]: E0318 15:58:06.708436 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4\": container with ID starting with cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4 not found: ID does not exist" containerID="cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.708503 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4"} err="failed to get container status \"cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4\": rpc error: code = NotFound desc = could not find container \"cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4\": container with ID starting with cf3d2d8cfcb8e553e7e797548c7da1178fcd4fbb4098da2325fbd44f65acf1c4 not found: ID does not exist" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.708560 4696 scope.go:117] "RemoveContainer" containerID="7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f" Mar 18 15:58:06 crc kubenswrapper[4696]: E0318 15:58:06.709085 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f\": container with ID starting with 7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f not found: ID does not 
exist" containerID="7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.709118 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f"} err="failed to get container status \"7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f\": rpc error: code = NotFound desc = could not find container \"7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f\": container with ID starting with 7f41d9f0a8d6c8cb412fd5cc67eacd4fd815324b30cb36060c28fe3198315f6f not found: ID does not exist" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.709137 4696 scope.go:117] "RemoveContainer" containerID="fb555445370e4aa80dadc7770577692450790261e03afe38f4da271c577ff3b2" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.720438 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.739023 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.742150 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.745301 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.745642 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.785409 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.801706 4696 scope.go:117] "RemoveContainer" containerID="151d0a071f9dc6476e0215d4b6a101e857bdb3953ef5da2933a196891fdd7548" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.862461 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.877408 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkwwx\" (UniqueName: \"kubernetes.io/projected/4c585254-06f5-4cf1-bc73-27d3b04795f0-kube-api-access-tkwwx\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.877467 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.877497 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-scripts\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " 
pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.877540 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-config-data\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.877580 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.877802 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-run-httpd\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.877831 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-log-httpd\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.966446 4696 scope.go:117] "RemoveContainer" containerID="7a723587643d21a0b67f5b2a09f847f0f2f128465d455ccdbee4d6d19cdcaea8" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.979932 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkwwx\" (UniqueName: \"kubernetes.io/projected/4c585254-06f5-4cf1-bc73-27d3b04795f0-kube-api-access-tkwwx\") pod \"ceilometer-0\" (UID: 
\"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.980007 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.980037 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-scripts\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.980073 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-config-data\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.980107 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.980149 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-run-httpd\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.980172 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-log-httpd\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.982311 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-run-httpd\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.982932 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-log-httpd\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.987679 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-config-data\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.987876 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.992141 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:06 crc kubenswrapper[4696]: I0318 15:58:06.992440 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-scripts\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:07 crc kubenswrapper[4696]: I0318 15:58:07.007268 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkwwx\" (UniqueName: \"kubernetes.io/projected/4c585254-06f5-4cf1-bc73-27d3b04795f0-kube-api-access-tkwwx\") pod \"ceilometer-0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " pod="openstack/ceilometer-0" Mar 18 15:58:07 crc kubenswrapper[4696]: I0318 15:58:07.256514 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:58:07 crc kubenswrapper[4696]: I0318 15:58:07.265918 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 15:58:07 crc kubenswrapper[4696]: I0318 15:58:07.557432 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e","Type":"ContainerStarted","Data":"0926e7bfc3212089e92871c1f8ea636731d07c60d09dd34ee25d96bd7875693a"} Mar 18 15:58:07 crc kubenswrapper[4696]: I0318 15:58:07.617721 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44b79cc9-0ed9-4f84-a352-fd6ba6d28daf" path="/var/lib/kubelet/pods/44b79cc9-0ed9-4f84-a352-fd6ba6d28daf/volumes" Mar 18 15:58:07 crc kubenswrapper[4696]: I0318 15:58:07.618641 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e295c1a-9787-42a3-ac9c-5252bda652b5" path="/var/lib/kubelet/pods/6e295c1a-9787-42a3-ac9c-5252bda652b5/volumes" Mar 18 15:58:07 crc kubenswrapper[4696]: I0318 15:58:07.619883 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="929544d7-d084-4e0c-bfa6-442fbd5a3ab4" path="/var/lib/kubelet/pods/929544d7-d084-4e0c-bfa6-442fbd5a3ab4/volumes" 
Mar 18 15:58:07 crc kubenswrapper[4696]: I0318 15:58:07.773026 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:07 crc kubenswrapper[4696]: W0318 15:58:07.789778 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c585254_06f5_4cf1_bc73_27d3b04795f0.slice/crio-5941e3060510fb9db038e5517fcbbe5053e7554660cf57030e70eaef3422b99b WatchSource:0}: Error finding container 5941e3060510fb9db038e5517fcbbe5053e7554660cf57030e70eaef3422b99b: Status 404 returned error can't find the container with id 5941e3060510fb9db038e5517fcbbe5053e7554660cf57030e70eaef3422b99b Mar 18 15:58:08 crc kubenswrapper[4696]: I0318 15:58:08.582405 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerStarted","Data":"5941e3060510fb9db038e5517fcbbe5053e7554660cf57030e70eaef3422b99b"} Mar 18 15:58:09 crc kubenswrapper[4696]: I0318 15:58:09.634789 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerStarted","Data":"a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d"} Mar 18 15:58:10 crc kubenswrapper[4696]: I0318 15:58:10.632684 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerStarted","Data":"d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4"} Mar 18 15:58:10 crc kubenswrapper[4696]: I0318 15:58:10.633509 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerStarted","Data":"b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd"} Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.371383 4696 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.683779 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6847f4969-jlnz4"] Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.685781 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.688925 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.689242 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.692493 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.704222 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6847f4969-jlnz4"] Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.798746 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-public-tls-certs\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.798889 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2418339a-4137-4f64-b098-f0e5011d3f61-log-httpd\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.798950 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2418339a-4137-4f64-b098-f0e5011d3f61-etc-swift\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.798980 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-internal-tls-certs\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.799007 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwnk5\" (UniqueName: \"kubernetes.io/projected/2418339a-4137-4f64-b098-f0e5011d3f61-kube-api-access-jwnk5\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.799025 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-combined-ca-bundle\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.799051 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2418339a-4137-4f64-b098-f0e5011d3f61-run-httpd\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.799074 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-config-data\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.901492 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2418339a-4137-4f64-b098-f0e5011d3f61-log-httpd\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.902115 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2418339a-4137-4f64-b098-f0e5011d3f61-etc-swift\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.902171 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-internal-tls-certs\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.902220 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwnk5\" (UniqueName: \"kubernetes.io/projected/2418339a-4137-4f64-b098-f0e5011d3f61-kube-api-access-jwnk5\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.902244 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-combined-ca-bundle\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.902277 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2418339a-4137-4f64-b098-f0e5011d3f61-run-httpd\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.902306 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-config-data\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.902371 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-public-tls-certs\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.902610 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2418339a-4137-4f64-b098-f0e5011d3f61-log-httpd\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.903343 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2418339a-4137-4f64-b098-f0e5011d3f61-run-httpd\") 
pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.914442 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2418339a-4137-4f64-b098-f0e5011d3f61-etc-swift\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.914908 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-combined-ca-bundle\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.915310 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-config-data\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.921402 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-internal-tls-certs\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.929453 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwnk5\" (UniqueName: \"kubernetes.io/projected/2418339a-4137-4f64-b098-f0e5011d3f61-kube-api-access-jwnk5\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " 
pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:11 crc kubenswrapper[4696]: I0318 15:58:11.929706 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2418339a-4137-4f64-b098-f0e5011d3f61-public-tls-certs\") pod \"swift-proxy-6847f4969-jlnz4\" (UID: \"2418339a-4137-4f64-b098-f0e5011d3f61\") " pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:12 crc kubenswrapper[4696]: I0318 15:58:12.039766 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:12 crc kubenswrapper[4696]: I0318 15:58:12.219941 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 15:58:16 crc kubenswrapper[4696]: I0318 15:58:16.449903 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-696476876d-4rxz2" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 18 15:58:18 crc kubenswrapper[4696]: I0318 15:58:18.578142 4696 scope.go:117] "RemoveContainer" containerID="85d4d89a691ba2e1380ae2e2087cb08448358e08f1a924394cc63b30535917b9" Mar 18 15:58:18 crc kubenswrapper[4696]: I0318 15:58:18.853576 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6847f4969-jlnz4"] Mar 18 15:58:18 crc kubenswrapper[4696]: W0318 15:58:18.923045 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2418339a_4137_4f64_b098_f0e5011d3f61.slice/crio-05cbf9951bab3c5369fa97e1199a9da523b0d1647a5ca71c7a2eae8010f126d1 WatchSource:0}: Error finding container 05cbf9951bab3c5369fa97e1199a9da523b0d1647a5ca71c7a2eae8010f126d1: Status 404 returned error can't find the container with id 
05cbf9951bab3c5369fa97e1199a9da523b0d1647a5ca71c7a2eae8010f126d1 Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.331557 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4qnw5"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.333332 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.350640 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4qnw5"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.417639 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsm8s\" (UniqueName: \"kubernetes.io/projected/b1ce5a9d-83ad-459d-b149-25d0760e14e2-kube-api-access-lsm8s\") pod \"nova-api-db-create-4qnw5\" (UID: \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\") " pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.418220 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ce5a9d-83ad-459d-b149-25d0760e14e2-operator-scripts\") pod \"nova-api-db-create-4qnw5\" (UID: \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\") " pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.461271 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-xpnvg"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.462890 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.481267 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xpnvg"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.522709 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ce5a9d-83ad-459d-b149-25d0760e14e2-operator-scripts\") pod \"nova-api-db-create-4qnw5\" (UID: \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\") " pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.522844 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsm8s\" (UniqueName: \"kubernetes.io/projected/b1ce5a9d-83ad-459d-b149-25d0760e14e2-kube-api-access-lsm8s\") pod \"nova-api-db-create-4qnw5\" (UID: \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\") " pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.523792 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ce5a9d-83ad-459d-b149-25d0760e14e2-operator-scripts\") pod \"nova-api-db-create-4qnw5\" (UID: \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\") " pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.542586 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-t9mvs"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.545354 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.555811 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsm8s\" (UniqueName: \"kubernetes.io/projected/b1ce5a9d-83ad-459d-b149-25d0760e14e2-kube-api-access-lsm8s\") pod \"nova-api-db-create-4qnw5\" (UID: \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\") " pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.577080 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t9mvs"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.586758 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7121-account-create-update-45gq6"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.588331 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.591742 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.595878 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7121-account-create-update-45gq6"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.628234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfslm\" (UniqueName: \"kubernetes.io/projected/a539865a-1121-432b-909b-617f805460d9-kube-api-access-zfslm\") pod \"nova-cell0-db-create-xpnvg\" (UID: \"a539865a-1121-432b-909b-617f805460d9\") " pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.628702 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a539865a-1121-432b-909b-617f805460d9-operator-scripts\") pod \"nova-cell0-db-create-xpnvg\" (UID: \"a539865a-1121-432b-909b-617f805460d9\") " pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.628733 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn8g4\" (UniqueName: \"kubernetes.io/projected/251f8d1d-fe68-479f-9af1-ec8c288b3524-kube-api-access-bn8g4\") pod \"nova-cell1-db-create-t9mvs\" (UID: \"251f8d1d-fe68-479f-9af1-ec8c288b3524\") " pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.628910 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251f8d1d-fe68-479f-9af1-ec8c288b3524-operator-scripts\") pod \"nova-cell1-db-create-t9mvs\" (UID: \"251f8d1d-fe68-479f-9af1-ec8c288b3524\") " pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.724509 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.731316 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5rc5\" (UniqueName: \"kubernetes.io/projected/2b054f24-437a-4095-b1c4-c7e20b1e68d5-kube-api-access-z5rc5\") pod \"nova-api-7121-account-create-update-45gq6\" (UID: \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\") " pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.731472 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfslm\" (UniqueName: \"kubernetes.io/projected/a539865a-1121-432b-909b-617f805460d9-kube-api-access-zfslm\") pod \"nova-cell0-db-create-xpnvg\" (UID: \"a539865a-1121-432b-909b-617f805460d9\") " pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.731538 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a539865a-1121-432b-909b-617f805460d9-operator-scripts\") pod \"nova-cell0-db-create-xpnvg\" (UID: \"a539865a-1121-432b-909b-617f805460d9\") " pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.731567 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn8g4\" (UniqueName: \"kubernetes.io/projected/251f8d1d-fe68-479f-9af1-ec8c288b3524-kube-api-access-bn8g4\") pod \"nova-cell1-db-create-t9mvs\" (UID: \"251f8d1d-fe68-479f-9af1-ec8c288b3524\") " pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.731654 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251f8d1d-fe68-479f-9af1-ec8c288b3524-operator-scripts\") pod \"nova-cell1-db-create-t9mvs\" 
(UID: \"251f8d1d-fe68-479f-9af1-ec8c288b3524\") " pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.731687 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b054f24-437a-4095-b1c4-c7e20b1e68d5-operator-scripts\") pod \"nova-api-7121-account-create-update-45gq6\" (UID: \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\") " pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.732665 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251f8d1d-fe68-479f-9af1-ec8c288b3524-operator-scripts\") pod \"nova-cell1-db-create-t9mvs\" (UID: \"251f8d1d-fe68-479f-9af1-ec8c288b3524\") " pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.732886 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a539865a-1121-432b-909b-617f805460d9-operator-scripts\") pod \"nova-cell0-db-create-xpnvg\" (UID: \"a539865a-1121-432b-909b-617f805460d9\") " pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.749142 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-726d-account-create-update-c8c2t"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.752659 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.755144 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.763354 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfslm\" (UniqueName: \"kubernetes.io/projected/a539865a-1121-432b-909b-617f805460d9-kube-api-access-zfslm\") pod \"nova-cell0-db-create-xpnvg\" (UID: \"a539865a-1121-432b-909b-617f805460d9\") " pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.764043 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn8g4\" (UniqueName: \"kubernetes.io/projected/251f8d1d-fe68-479f-9af1-ec8c288b3524-kube-api-access-bn8g4\") pod \"nova-cell1-db-create-t9mvs\" (UID: \"251f8d1d-fe68-479f-9af1-ec8c288b3524\") " pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.775479 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-726d-account-create-update-c8c2t"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.783275 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6847f4969-jlnz4" event={"ID":"2418339a-4137-4f64-b098-f0e5011d3f61","Type":"ContainerStarted","Data":"05cbf9951bab3c5369fa97e1199a9da523b0d1647a5ca71c7a2eae8010f126d1"} Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.799882 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.834341 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5rc5\" (UniqueName: \"kubernetes.io/projected/2b054f24-437a-4095-b1c4-c7e20b1e68d5-kube-api-access-z5rc5\") pod \"nova-api-7121-account-create-update-45gq6\" (UID: \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\") " pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.834560 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-operator-scripts\") pod \"nova-cell0-726d-account-create-update-c8c2t\" (UID: \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\") " pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.834769 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b054f24-437a-4095-b1c4-c7e20b1e68d5-operator-scripts\") pod \"nova-api-7121-account-create-update-45gq6\" (UID: \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\") " pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.835659 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b054f24-437a-4095-b1c4-c7e20b1e68d5-operator-scripts\") pod \"nova-api-7121-account-create-update-45gq6\" (UID: \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\") " pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.835777 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp89p\" (UniqueName: 
\"kubernetes.io/projected/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-kube-api-access-wp89p\") pod \"nova-cell0-726d-account-create-update-c8c2t\" (UID: \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\") " pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.862898 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5rc5\" (UniqueName: \"kubernetes.io/projected/2b054f24-437a-4095-b1c4-c7e20b1e68d5-kube-api-access-z5rc5\") pod \"nova-api-7121-account-create-update-45gq6\" (UID: \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\") " pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.938374 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp89p\" (UniqueName: \"kubernetes.io/projected/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-kube-api-access-wp89p\") pod \"nova-cell0-726d-account-create-update-c8c2t\" (UID: \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\") " pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.938505 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-operator-scripts\") pod \"nova-cell0-726d-account-create-update-c8c2t\" (UID: \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\") " pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.939354 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-operator-scripts\") pod \"nova-cell0-726d-account-create-update-c8c2t\" (UID: \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\") " pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 
15:58:19.943157 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.958163 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.970810 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp89p\" (UniqueName: \"kubernetes.io/projected/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-kube-api-access-wp89p\") pod \"nova-cell0-726d-account-create-update-c8c2t\" (UID: \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\") " pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.978819 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e7be-account-create-update-v2pr7"] Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.980792 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.985130 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 15:58:19 crc kubenswrapper[4696]: I0318 15:58:19.991180 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e7be-account-create-update-v2pr7"] Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.146366 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7xb\" (UniqueName: \"kubernetes.io/projected/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-kube-api-access-2p7xb\") pod \"nova-cell1-e7be-account-create-update-v2pr7\" (UID: \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\") " pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.149569 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-operator-scripts\") pod \"nova-cell1-e7be-account-create-update-v2pr7\" (UID: \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\") " pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.231213 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.252604 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-operator-scripts\") pod \"nova-cell1-e7be-account-create-update-v2pr7\" (UID: \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\") " pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.251890 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-operator-scripts\") pod \"nova-cell1-e7be-account-create-update-v2pr7\" (UID: \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\") " pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.253043 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7xb\" (UniqueName: \"kubernetes.io/projected/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-kube-api-access-2p7xb\") pod \"nova-cell1-e7be-account-create-update-v2pr7\" (UID: \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\") " pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.273557 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7xb\" (UniqueName: \"kubernetes.io/projected/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-kube-api-access-2p7xb\") pod \"nova-cell1-e7be-account-create-update-v2pr7\" (UID: \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\") " pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.315470 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.391850 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4qnw5"] Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.498797 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xpnvg"] Mar 18 15:58:20 crc kubenswrapper[4696]: W0318 15:58:20.576798 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda539865a_1121_432b_909b_617f805460d9.slice/crio-6b1638c7b82f6b4eb511fc00935330871749a2330c05bc1642111ce3e2b19c5b WatchSource:0}: Error finding container 6b1638c7b82f6b4eb511fc00935330871749a2330c05bc1642111ce3e2b19c5b: Status 404 returned error can't find the container with id 6b1638c7b82f6b4eb511fc00935330871749a2330c05bc1642111ce3e2b19c5b Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.752467 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7121-account-create-update-45gq6"] Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.809404 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7121-account-create-update-45gq6" event={"ID":"2b054f24-437a-4095-b1c4-c7e20b1e68d5","Type":"ContainerStarted","Data":"d04c072eeb2830291cd77e6806edacc64fe2ff03420f04df1378dccaf2af478d"} Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.820281 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xpnvg" event={"ID":"a539865a-1121-432b-909b-617f805460d9","Type":"ContainerStarted","Data":"6b1638c7b82f6b4eb511fc00935330871749a2330c05bc1642111ce3e2b19c5b"} Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.829768 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerStarted","Data":"df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129"} Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.830026 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="ceilometer-central-agent" containerID="cri-o://a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d" gracePeriod=30 Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.830342 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.830716 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="proxy-httpd" containerID="cri-o://df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129" gracePeriod=30 Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.830768 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="sg-core" containerID="cri-o://d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4" gracePeriod=30 Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.830837 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="ceilometer-notification-agent" containerID="cri-o://b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd" gracePeriod=30 Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.835165 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4qnw5" 
event={"ID":"b1ce5a9d-83ad-459d-b149-25d0760e14e2","Type":"ContainerStarted","Data":"363f10896e030c04f1e8b8d543a0b2a5e61038756982ab17792857a35ac69b42"} Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.854241 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6847f4969-jlnz4" event={"ID":"2418339a-4137-4f64-b098-f0e5011d3f61","Type":"ContainerStarted","Data":"bbbd172488c349cd493d511d7e8f606f399451cc2b3d86a606d9adf7c8cc484b"} Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.854305 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6847f4969-jlnz4" event={"ID":"2418339a-4137-4f64-b098-f0e5011d3f61","Type":"ContainerStarted","Data":"1221f552034b0d9ed64f395e02fe7af46e7d7ce82ee7b8f53c8ed9fbe4ea642c"} Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.854621 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.854738 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.867470 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e","Type":"ContainerStarted","Data":"d268a76e9825792254885c7003b14b253d676b08684a4a1bbad7dbbb2087de2e"} Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.886510 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t9mvs"] Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.894707 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.511624091 podStartE2EDuration="14.89466497s" podCreationTimestamp="2026-03-18 15:58:06 +0000 UTC" firstStartedPulling="2026-03-18 15:58:07.792986714 +0000 UTC m=+1330.799160920" lastFinishedPulling="2026-03-18 
15:58:20.176027603 +0000 UTC m=+1343.182201799" observedRunningTime="2026-03-18 15:58:20.868358191 +0000 UTC m=+1343.874532397" watchObservedRunningTime="2026-03-18 15:58:20.89466497 +0000 UTC m=+1343.900839176" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.914015 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6847f4969-jlnz4" podStartSLOduration=9.913990384 podStartE2EDuration="9.913990384s" podCreationTimestamp="2026-03-18 15:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:20.913323617 +0000 UTC m=+1343.919497823" watchObservedRunningTime="2026-03-18 15:58:20.913990384 +0000 UTC m=+1343.920164590" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.960699 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.093609656 podStartE2EDuration="14.960667033s" podCreationTimestamp="2026-03-18 15:58:06 +0000 UTC" firstStartedPulling="2026-03-18 15:58:07.279179429 +0000 UTC m=+1330.285353635" lastFinishedPulling="2026-03-18 15:58:20.146236806 +0000 UTC m=+1343.152411012" observedRunningTime="2026-03-18 15:58:20.942339324 +0000 UTC m=+1343.948513530" watchObservedRunningTime="2026-03-18 15:58:20.960667033 +0000 UTC m=+1343.966841239" Mar 18 15:58:20 crc kubenswrapper[4696]: I0318 15:58:20.987235 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-726d-account-create-update-c8c2t"] Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.054392 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e7be-account-create-update-v2pr7"] Mar 18 15:58:21 crc kubenswrapper[4696]: W0318 15:58:21.074031 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2a154bb_e9c5_44ce_bf27_7bdf87c86ea3.slice/crio-26e05ce7c6abb1b7cfed0683690731ec3510860f88415b780a982ebcc2b87620 WatchSource:0}: Error finding container 26e05ce7c6abb1b7cfed0683690731ec3510860f88415b780a982ebcc2b87620: Status 404 returned error can't find the container with id 26e05ce7c6abb1b7cfed0683690731ec3510860f88415b780a982ebcc2b87620 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.514005 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-667fb94989-br52g" Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.580815 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6fd69d9b-g5mkf"] Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.581140 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6fd69d9b-g5mkf" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerName="neutron-api" containerID="cri-o://1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67" gracePeriod=30 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.581808 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6fd69d9b-g5mkf" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerName="neutron-httpd" containerID="cri-o://3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6" gracePeriod=30 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.889830 4696 generic.go:334] "Generic (PLEG): container finished" podID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerID="d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4" exitCode=2 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.890250 4696 generic.go:334] "Generic (PLEG): container finished" podID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerID="a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d" exitCode=0 Mar 18 15:58:21 crc 
kubenswrapper[4696]: I0318 15:58:21.890332 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerDied","Data":"d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.890369 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerDied","Data":"a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.895115 4696 generic.go:334] "Generic (PLEG): container finished" podID="159d7aa7-449a-4e20-940c-f8c6f6fc88f0" containerID="30b707128ffd2831d6f66974959577976d5c642f27a6dd268b6f74821dea499d" exitCode=0 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.895197 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-726d-account-create-update-c8c2t" event={"ID":"159d7aa7-449a-4e20-940c-f8c6f6fc88f0","Type":"ContainerDied","Data":"30b707128ffd2831d6f66974959577976d5c642f27a6dd268b6f74821dea499d"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.895229 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-726d-account-create-update-c8c2t" event={"ID":"159d7aa7-449a-4e20-940c-f8c6f6fc88f0","Type":"ContainerStarted","Data":"d83b1b2f74d2b97500397809d909096d357d91191590617008dcc696f40ded02"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.922329 4696 generic.go:334] "Generic (PLEG): container finished" podID="b1ce5a9d-83ad-459d-b149-25d0760e14e2" containerID="93559a5afc6b0d765466ad8d35a5ad9e433da98933e1fd3b809da89c53f877f7" exitCode=0 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.922438 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4qnw5" 
event={"ID":"b1ce5a9d-83ad-459d-b149-25d0760e14e2","Type":"ContainerDied","Data":"93559a5afc6b0d765466ad8d35a5ad9e433da98933e1fd3b809da89c53f877f7"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.927803 4696 generic.go:334] "Generic (PLEG): container finished" podID="251f8d1d-fe68-479f-9af1-ec8c288b3524" containerID="b3e300c1debfef4904b1384978e21660987ce8668b8d5bc6645377fc4fd68e6d" exitCode=0 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.927875 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t9mvs" event={"ID":"251f8d1d-fe68-479f-9af1-ec8c288b3524","Type":"ContainerDied","Data":"b3e300c1debfef4904b1384978e21660987ce8668b8d5bc6645377fc4fd68e6d"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.927904 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t9mvs" event={"ID":"251f8d1d-fe68-479f-9af1-ec8c288b3524","Type":"ContainerStarted","Data":"2dcfaf23965b0dec4e49ef8acb8ec4be29d7b32ccdaf6f19823cab01f9726be9"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.933582 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7121-account-create-update-45gq6" event={"ID":"2b054f24-437a-4095-b1c4-c7e20b1e68d5","Type":"ContainerDied","Data":"c376fef3ac9ee5e7617d35469707ccc25afd0355f3ebca22d225c423b65f6df4"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.933868 4696 generic.go:334] "Generic (PLEG): container finished" podID="2b054f24-437a-4095-b1c4-c7e20b1e68d5" containerID="c376fef3ac9ee5e7617d35469707ccc25afd0355f3ebca22d225c423b65f6df4" exitCode=0 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.939035 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" event={"ID":"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3","Type":"ContainerStarted","Data":"c581e429fcd7c983f88db67db759762819b5e9dd6af085b22d1a439d1d6e1076"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 
15:58:21.939081 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" event={"ID":"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3","Type":"ContainerStarted","Data":"26e05ce7c6abb1b7cfed0683690731ec3510860f88415b780a982ebcc2b87620"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.942637 4696 generic.go:334] "Generic (PLEG): container finished" podID="a539865a-1121-432b-909b-617f805460d9" containerID="8caa3000254ac99a134d89c5d4235ddc93922c7ed7b9e1bdc27fe7ff7fa9305d" exitCode=0 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.942733 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xpnvg" event={"ID":"a539865a-1121-432b-909b-617f805460d9","Type":"ContainerDied","Data":"8caa3000254ac99a134d89c5d4235ddc93922c7ed7b9e1bdc27fe7ff7fa9305d"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.945280 4696 generic.go:334] "Generic (PLEG): container finished" podID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerID="3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6" exitCode=0 Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.946393 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd69d9b-g5mkf" event={"ID":"8a89cfa6-c720-40a1-aaa2-cfe62c153c14","Type":"ContainerDied","Data":"3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6"} Mar 18 15:58:21 crc kubenswrapper[4696]: I0318 15:58:21.994196 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" podStartSLOduration=2.994131769 podStartE2EDuration="2.994131769s" podCreationTimestamp="2026-03-18 15:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:21.989743799 +0000 UTC m=+1344.995918015" watchObservedRunningTime="2026-03-18 15:58:21.994131769 +0000 UTC m=+1345.000305965" Mar 
18 15:58:22 crc kubenswrapper[4696]: I0318 15:58:22.977459 4696 generic.go:334] "Generic (PLEG): container finished" podID="f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3" containerID="c581e429fcd7c983f88db67db759762819b5e9dd6af085b22d1a439d1d6e1076" exitCode=0 Mar 18 15:58:22 crc kubenswrapper[4696]: I0318 15:58:22.977641 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" event={"ID":"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3","Type":"ContainerDied","Data":"c581e429fcd7c983f88db67db759762819b5e9dd6af085b22d1a439d1d6e1076"} Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.489614 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.668457 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.682850 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.690834 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b054f24-437a-4095-b1c4-c7e20b1e68d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b054f24-437a-4095-b1c4-c7e20b1e68d5" (UID: "2b054f24-437a-4095-b1c4-c7e20b1e68d5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.691149 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b054f24-437a-4095-b1c4-c7e20b1e68d5-operator-scripts\") pod \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\" (UID: \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.691684 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rc5\" (UniqueName: \"kubernetes.io/projected/2b054f24-437a-4095-b1c4-c7e20b1e68d5-kube-api-access-z5rc5\") pod \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\" (UID: \"2b054f24-437a-4095-b1c4-c7e20b1e68d5\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.693117 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b054f24-437a-4095-b1c4-c7e20b1e68d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.698073 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.701553 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b054f24-437a-4095-b1c4-c7e20b1e68d5-kube-api-access-z5rc5" (OuterVolumeSpecName: "kube-api-access-z5rc5") pod "2b054f24-437a-4095-b1c4-c7e20b1e68d5" (UID: "2b054f24-437a-4095-b1c4-c7e20b1e68d5"). InnerVolumeSpecName "kube-api-access-z5rc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.705767 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.794490 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ce5a9d-83ad-459d-b149-25d0760e14e2-operator-scripts\") pod \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\" (UID: \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.794653 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsm8s\" (UniqueName: \"kubernetes.io/projected/b1ce5a9d-83ad-459d-b149-25d0760e14e2-kube-api-access-lsm8s\") pod \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\" (UID: \"b1ce5a9d-83ad-459d-b149-25d0760e14e2\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.794817 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251f8d1d-fe68-479f-9af1-ec8c288b3524-operator-scripts\") pod \"251f8d1d-fe68-479f-9af1-ec8c288b3524\" (UID: \"251f8d1d-fe68-479f-9af1-ec8c288b3524\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.794848 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn8g4\" (UniqueName: \"kubernetes.io/projected/251f8d1d-fe68-479f-9af1-ec8c288b3524-kube-api-access-bn8g4\") pod \"251f8d1d-fe68-479f-9af1-ec8c288b3524\" (UID: \"251f8d1d-fe68-479f-9af1-ec8c288b3524\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.794870 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfslm\" (UniqueName: \"kubernetes.io/projected/a539865a-1121-432b-909b-617f805460d9-kube-api-access-zfslm\") pod \"a539865a-1121-432b-909b-617f805460d9\" (UID: \"a539865a-1121-432b-909b-617f805460d9\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.794972 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a539865a-1121-432b-909b-617f805460d9-operator-scripts\") pod \"a539865a-1121-432b-909b-617f805460d9\" (UID: \"a539865a-1121-432b-909b-617f805460d9\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.795442 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5rc5\" (UniqueName: \"kubernetes.io/projected/2b054f24-437a-4095-b1c4-c7e20b1e68d5-kube-api-access-z5rc5\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.795621 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1ce5a9d-83ad-459d-b149-25d0760e14e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1ce5a9d-83ad-459d-b149-25d0760e14e2" (UID: "b1ce5a9d-83ad-459d-b149-25d0760e14e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.796079 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251f8d1d-fe68-479f-9af1-ec8c288b3524-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "251f8d1d-fe68-479f-9af1-ec8c288b3524" (UID: "251f8d1d-fe68-479f-9af1-ec8c288b3524"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.796601 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a539865a-1121-432b-909b-617f805460d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a539865a-1121-432b-909b-617f805460d9" (UID: "a539865a-1121-432b-909b-617f805460d9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.800072 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ce5a9d-83ad-459d-b149-25d0760e14e2-kube-api-access-lsm8s" (OuterVolumeSpecName: "kube-api-access-lsm8s") pod "b1ce5a9d-83ad-459d-b149-25d0760e14e2" (UID: "b1ce5a9d-83ad-459d-b149-25d0760e14e2"). InnerVolumeSpecName "kube-api-access-lsm8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.802406 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/251f8d1d-fe68-479f-9af1-ec8c288b3524-kube-api-access-bn8g4" (OuterVolumeSpecName: "kube-api-access-bn8g4") pod "251f8d1d-fe68-479f-9af1-ec8c288b3524" (UID: "251f8d1d-fe68-479f-9af1-ec8c288b3524"). InnerVolumeSpecName "kube-api-access-bn8g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.805610 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a539865a-1121-432b-909b-617f805460d9-kube-api-access-zfslm" (OuterVolumeSpecName: "kube-api-access-zfslm") pod "a539865a-1121-432b-909b-617f805460d9" (UID: "a539865a-1121-432b-909b-617f805460d9"). InnerVolumeSpecName "kube-api-access-zfslm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.896983 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp89p\" (UniqueName: \"kubernetes.io/projected/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-kube-api-access-wp89p\") pod \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\" (UID: \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.897227 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-operator-scripts\") pod \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\" (UID: \"159d7aa7-449a-4e20-940c-f8c6f6fc88f0\") " Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.897804 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "159d7aa7-449a-4e20-940c-f8c6f6fc88f0" (UID: "159d7aa7-449a-4e20-940c-f8c6f6fc88f0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.898584 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251f8d1d-fe68-479f-9af1-ec8c288b3524-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.898613 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn8g4\" (UniqueName: \"kubernetes.io/projected/251f8d1d-fe68-479f-9af1-ec8c288b3524-kube-api-access-bn8g4\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.898632 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfslm\" (UniqueName: \"kubernetes.io/projected/a539865a-1121-432b-909b-617f805460d9-kube-api-access-zfslm\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.898649 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a539865a-1121-432b-909b-617f805460d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.898661 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ce5a9d-83ad-459d-b149-25d0760e14e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.898674 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsm8s\" (UniqueName: \"kubernetes.io/projected/b1ce5a9d-83ad-459d-b149-25d0760e14e2-kube-api-access-lsm8s\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.898687 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 
15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.904780 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-kube-api-access-wp89p" (OuterVolumeSpecName: "kube-api-access-wp89p") pod "159d7aa7-449a-4e20-940c-f8c6f6fc88f0" (UID: "159d7aa7-449a-4e20-940c-f8c6f6fc88f0"). InnerVolumeSpecName "kube-api-access-wp89p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.945367 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.945691 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-log" containerID="cri-o://5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf" gracePeriod=30 Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.945789 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-httpd" containerID="cri-o://b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79" gracePeriod=30 Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.989628 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t9mvs" event={"ID":"251f8d1d-fe68-479f-9af1-ec8c288b3524","Type":"ContainerDied","Data":"2dcfaf23965b0dec4e49ef8acb8ec4be29d7b32ccdaf6f19823cab01f9726be9"} Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.989682 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dcfaf23965b0dec4e49ef8acb8ec4be29d7b32ccdaf6f19823cab01f9726be9" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.989758 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t9mvs" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.994997 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7121-account-create-update-45gq6" event={"ID":"2b054f24-437a-4095-b1c4-c7e20b1e68d5","Type":"ContainerDied","Data":"d04c072eeb2830291cd77e6806edacc64fe2ff03420f04df1378dccaf2af478d"} Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.995065 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d04c072eeb2830291cd77e6806edacc64fe2ff03420f04df1378dccaf2af478d" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.995028 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7121-account-create-update-45gq6" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.997173 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xpnvg" Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.997166 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xpnvg" event={"ID":"a539865a-1121-432b-909b-617f805460d9","Type":"ContainerDied","Data":"6b1638c7b82f6b4eb511fc00935330871749a2330c05bc1642111ce3e2b19c5b"} Mar 18 15:58:23 crc kubenswrapper[4696]: I0318 15:58:23.997287 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b1638c7b82f6b4eb511fc00935330871749a2330c05bc1642111ce3e2b19c5b" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.000364 4696 generic.go:334] "Generic (PLEG): container finished" podID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerID="b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd" exitCode=0 Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.000438 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerDied","Data":"b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd"} Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.000410 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp89p\" (UniqueName: \"kubernetes.io/projected/159d7aa7-449a-4e20-940c-f8c6f6fc88f0-kube-api-access-wp89p\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.013688 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-726d-account-create-update-c8c2t" event={"ID":"159d7aa7-449a-4e20-940c-f8c6f6fc88f0","Type":"ContainerDied","Data":"d83b1b2f74d2b97500397809d909096d357d91191590617008dcc696f40ded02"} Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.013769 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d83b1b2f74d2b97500397809d909096d357d91191590617008dcc696f40ded02" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.013849 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-726d-account-create-update-c8c2t" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.016726 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4qnw5" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.023635 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4qnw5" event={"ID":"b1ce5a9d-83ad-459d-b149-25d0760e14e2","Type":"ContainerDied","Data":"363f10896e030c04f1e8b8d543a0b2a5e61038756982ab17792857a35ac69b42"} Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.023699 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="363f10896e030c04f1e8b8d543a0b2a5e61038756982ab17792857a35ac69b42" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.436410 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.613467 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-operator-scripts\") pod \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\" (UID: \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\") " Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.613907 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p7xb\" (UniqueName: \"kubernetes.io/projected/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-kube-api-access-2p7xb\") pod \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\" (UID: \"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3\") " Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.615131 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3" (UID: "f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.619565 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-kube-api-access-2p7xb" (OuterVolumeSpecName: "kube-api-access-2p7xb") pod "f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3" (UID: "f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3"). InnerVolumeSpecName "kube-api-access-2p7xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.717334 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p7xb\" (UniqueName: \"kubernetes.io/projected/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-kube-api-access-2p7xb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.717833 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.881694 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.882073 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-log" containerID="cri-o://d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30" gracePeriod=30 Mar 18 15:58:24 crc kubenswrapper[4696]: I0318 15:58:24.882208 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-httpd" containerID="cri-o://7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411" gracePeriod=30 Mar 18 15:58:25 crc kubenswrapper[4696]: 
I0318 15:58:25.035755 4696 generic.go:334] "Generic (PLEG): container finished" podID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerID="5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf" exitCode=143 Mar 18 15:58:25 crc kubenswrapper[4696]: I0318 15:58:25.035833 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"76a6f156-f710-4c15-a20f-649b27d7e7d6","Type":"ContainerDied","Data":"5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf"} Mar 18 15:58:25 crc kubenswrapper[4696]: I0318 15:58:25.039998 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" Mar 18 15:58:25 crc kubenswrapper[4696]: I0318 15:58:25.040211 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e7be-account-create-update-v2pr7" event={"ID":"f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3","Type":"ContainerDied","Data":"26e05ce7c6abb1b7cfed0683690731ec3510860f88415b780a982ebcc2b87620"} Mar 18 15:58:25 crc kubenswrapper[4696]: I0318 15:58:25.040271 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26e05ce7c6abb1b7cfed0683690731ec3510860f88415b780a982ebcc2b87620" Mar 18 15:58:25 crc kubenswrapper[4696]: I0318 15:58:25.046831 4696 generic.go:334] "Generic (PLEG): container finished" podID="190416e8-e777-4ec5-b017-8b28d749252e" containerID="d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30" exitCode=143 Mar 18 15:58:25 crc kubenswrapper[4696]: I0318 15:58:25.046891 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"190416e8-e777-4ec5-b017-8b28d749252e","Type":"ContainerDied","Data":"d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30"} Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.011155 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.065014 4696 generic.go:334] "Generic (PLEG): container finished" podID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerID="1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67" exitCode=0 Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.065074 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd69d9b-g5mkf" event={"ID":"8a89cfa6-c720-40a1-aaa2-cfe62c153c14","Type":"ContainerDied","Data":"1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67"} Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.065114 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd69d9b-g5mkf" event={"ID":"8a89cfa6-c720-40a1-aaa2-cfe62c153c14","Type":"ContainerDied","Data":"86ba95dc63897e0d296c5f46330ce41ccaf6f201a25be51b0c5b50d5df98c356"} Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.065122 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fd69d9b-g5mkf" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.065139 4696 scope.go:117] "RemoveContainer" containerID="3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.099996 4696 scope.go:117] "RemoveContainer" containerID="1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.125203 4696 scope.go:117] "RemoveContainer" containerID="3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6" Mar 18 15:58:26 crc kubenswrapper[4696]: E0318 15:58:26.126431 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6\": container with ID starting with 3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6 not found: ID does not exist" containerID="3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.126477 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6"} err="failed to get container status \"3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6\": rpc error: code = NotFound desc = could not find container \"3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6\": container with ID starting with 3ed8baf76939182337510c8296bf423be9f525c652717f15ced34e5b2a0098f6 not found: ID does not exist" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.126534 4696 scope.go:117] "RemoveContainer" containerID="1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67" Mar 18 15:58:26 crc kubenswrapper[4696]: E0318 15:58:26.126915 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67\": container with ID starting with 1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67 not found: ID does not exist" containerID="1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.126963 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67"} err="failed to get container status \"1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67\": rpc error: code = NotFound desc = could not find container \"1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67\": container with ID starting with 1618e915cd969c52f18b76321c8af6b8917af2d7b0bdb02f3cf23c588c6c9e67 not found: ID does not exist" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.168617 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-ovndb-tls-certs\") pod \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.168665 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-combined-ca-bundle\") pod \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.168920 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kklfm\" (UniqueName: \"kubernetes.io/projected/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-kube-api-access-kklfm\") pod \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " Mar 18 15:58:26 crc 
kubenswrapper[4696]: I0318 15:58:26.168974 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-httpd-config\") pod \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.169161 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-config\") pod \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\" (UID: \"8a89cfa6-c720-40a1-aaa2-cfe62c153c14\") " Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.175681 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-kube-api-access-kklfm" (OuterVolumeSpecName: "kube-api-access-kklfm") pod "8a89cfa6-c720-40a1-aaa2-cfe62c153c14" (UID: "8a89cfa6-c720-40a1-aaa2-cfe62c153c14"). InnerVolumeSpecName "kube-api-access-kklfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.183000 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8a89cfa6-c720-40a1-aaa2-cfe62c153c14" (UID: "8a89cfa6-c720-40a1-aaa2-cfe62c153c14"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.238693 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-config" (OuterVolumeSpecName: "config") pod "8a89cfa6-c720-40a1-aaa2-cfe62c153c14" (UID: "8a89cfa6-c720-40a1-aaa2-cfe62c153c14"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.239659 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a89cfa6-c720-40a1-aaa2-cfe62c153c14" (UID: "8a89cfa6-c720-40a1-aaa2-cfe62c153c14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.286282 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.286650 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kklfm\" (UniqueName: \"kubernetes.io/projected/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-kube-api-access-kklfm\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.286674 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.286686 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.288122 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8a89cfa6-c720-40a1-aaa2-cfe62c153c14" (UID: "8a89cfa6-c720-40a1-aaa2-cfe62c153c14"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.389658 4696 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a89cfa6-c720-40a1-aaa2-cfe62c153c14-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.411722 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6fd69d9b-g5mkf"] Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.421974 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6fd69d9b-g5mkf"] Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.448961 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-696476876d-4rxz2" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 18 15:58:26 crc kubenswrapper[4696]: I0318 15:58:26.449170 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.053102 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.057285 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6847f4969-jlnz4" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.103829 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9292/healthcheck\": read tcp 10.217.0.2:59358->10.217.0.158:9292: read: connection reset by peer" Mar 18 15:58:27 crc kubenswrapper[4696]: 
I0318 15:58:27.106026 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.158:9292/healthcheck\": read tcp 10.217.0.2:59356->10.217.0.158:9292: read: connection reset by peer" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.610603 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" path="/var/lib/kubelet/pods/8a89cfa6-c720-40a1-aaa2-cfe62c153c14/volumes" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.718163 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.830686 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-config-data\") pod \"76a6f156-f710-4c15-a20f-649b27d7e7d6\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.830783 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-logs\") pod \"76a6f156-f710-4c15-a20f-649b27d7e7d6\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.831018 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-scripts\") pod \"76a6f156-f710-4c15-a20f-649b27d7e7d6\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.831083 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcwnw\" (UniqueName: 
\"kubernetes.io/projected/76a6f156-f710-4c15-a20f-649b27d7e7d6-kube-api-access-fcwnw\") pod \"76a6f156-f710-4c15-a20f-649b27d7e7d6\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.831113 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-httpd-run\") pod \"76a6f156-f710-4c15-a20f-649b27d7e7d6\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.831138 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-combined-ca-bundle\") pod \"76a6f156-f710-4c15-a20f-649b27d7e7d6\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.831308 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-public-tls-certs\") pod \"76a6f156-f710-4c15-a20f-649b27d7e7d6\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.831417 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"76a6f156-f710-4c15-a20f-649b27d7e7d6\" (UID: \"76a6f156-f710-4c15-a20f-649b27d7e7d6\") " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.831494 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "76a6f156-f710-4c15-a20f-649b27d7e7d6" (UID: "76a6f156-f710-4c15-a20f-649b27d7e7d6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.832372 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-logs" (OuterVolumeSpecName: "logs") pod "76a6f156-f710-4c15-a20f-649b27d7e7d6" (UID: "76a6f156-f710-4c15-a20f-649b27d7e7d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.833121 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.833144 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a6f156-f710-4c15-a20f-649b27d7e7d6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.838701 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "76a6f156-f710-4c15-a20f-649b27d7e7d6" (UID: "76a6f156-f710-4c15-a20f-649b27d7e7d6"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.839488 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-scripts" (OuterVolumeSpecName: "scripts") pod "76a6f156-f710-4c15-a20f-649b27d7e7d6" (UID: "76a6f156-f710-4c15-a20f-649b27d7e7d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.843175 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a6f156-f710-4c15-a20f-649b27d7e7d6-kube-api-access-fcwnw" (OuterVolumeSpecName: "kube-api-access-fcwnw") pod "76a6f156-f710-4c15-a20f-649b27d7e7d6" (UID: "76a6f156-f710-4c15-a20f-649b27d7e7d6"). InnerVolumeSpecName "kube-api-access-fcwnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.876372 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76a6f156-f710-4c15-a20f-649b27d7e7d6" (UID: "76a6f156-f710-4c15-a20f-649b27d7e7d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.914091 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "76a6f156-f710-4c15-a20f-649b27d7e7d6" (UID: "76a6f156-f710-4c15-a20f-649b27d7e7d6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.936495 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.936547 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.936560 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcwnw\" (UniqueName: \"kubernetes.io/projected/76a6f156-f710-4c15-a20f-649b27d7e7d6-kube-api-access-fcwnw\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.936572 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.936581 4696 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.945011 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-config-data" (OuterVolumeSpecName: "config-data") pod "76a6f156-f710-4c15-a20f-649b27d7e7d6" (UID: "76a6f156-f710-4c15-a20f-649b27d7e7d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4696]: I0318 15:58:27.959567 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.038661 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.038733 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a6f156-f710-4c15-a20f-649b27d7e7d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.096878 4696 generic.go:334] "Generic (PLEG): container finished" podID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerID="b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79" exitCode=0 Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.096945 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"76a6f156-f710-4c15-a20f-649b27d7e7d6","Type":"ContainerDied","Data":"b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79"} Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.097033 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"76a6f156-f710-4c15-a20f-649b27d7e7d6","Type":"ContainerDied","Data":"484aa7107b415d042c57704dc78e4e352214915b423c058e1d51255140fcd3d8"} Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.097059 4696 scope.go:117] "RemoveContainer" containerID="b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.096970 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.118444 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.157:9292/healthcheck\": read tcp 10.217.0.2:34312->10.217.0.157:9292: read: connection reset by peer" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.118472 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9292/healthcheck\": read tcp 10.217.0.2:34300->10.217.0.157:9292: read: connection reset by peer" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.133881 4696 scope.go:117] "RemoveContainer" containerID="5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.185601 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.206658 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.354777 4696 scope.go:117] "RemoveContainer" containerID="b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.354935 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355514 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerName="neutron-httpd" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355566 4696 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerName="neutron-httpd" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355581 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3" containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355589 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3" containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355602 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-log" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355608 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-log" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355618 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerName="neutron-api" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355625 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerName="neutron-api" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355642 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ce5a9d-83ad-459d-b149-25d0760e14e2" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355648 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ce5a9d-83ad-459d-b149-25d0760e14e2" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355659 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-httpd" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355665 4696 
state_mem.go:107] "Deleted CPUSet assignment" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-httpd" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355677 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159d7aa7-449a-4e20-940c-f8c6f6fc88f0" containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355711 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="159d7aa7-449a-4e20-940c-f8c6f6fc88f0" containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355723 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b054f24-437a-4095-b1c4-c7e20b1e68d5" containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355730 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b054f24-437a-4095-b1c4-c7e20b1e68d5" containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355745 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="251f8d1d-fe68-479f-9af1-ec8c288b3524" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355751 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="251f8d1d-fe68-479f-9af1-ec8c288b3524" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.355764 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a539865a-1121-432b-909b-617f805460d9" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355770 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a539865a-1121-432b-909b-617f805460d9" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355962 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b054f24-437a-4095-b1c4-c7e20b1e68d5" 
containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355977 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a539865a-1121-432b-909b-617f805460d9" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355986 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1ce5a9d-83ad-459d-b149-25d0760e14e2" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.355993 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-log" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.356004 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="159d7aa7-449a-4e20-940c-f8c6f6fc88f0" containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.356009 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3" containerName="mariadb-account-create-update" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.356042 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerName="neutron-api" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.356053 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" containerName="glance-httpd" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.356063 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a89cfa6-c720-40a1-aaa2-cfe62c153c14" containerName="neutron-httpd" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.356076 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="251f8d1d-fe68-479f-9af1-ec8c288b3524" containerName="mariadb-database-create" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.357789 4696 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.357851 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79\": container with ID starting with b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79 not found: ID does not exist" containerID="b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.357889 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79"} err="failed to get container status \"b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79\": rpc error: code = NotFound desc = could not find container \"b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79\": container with ID starting with b6bfeef15251ff1e947a9ad69fb199e9f96dbcfadb87bc4e3a6bca3d0cce8f79 not found: ID does not exist" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.357919 4696 scope.go:117] "RemoveContainer" containerID="5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf" Mar 18 15:58:28 crc kubenswrapper[4696]: E0318 15:58:28.358572 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf\": container with ID starting with 5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf not found: ID does not exist" containerID="5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.358611 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf"} err="failed to get container status \"5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf\": rpc error: code = NotFound desc = could not find container \"5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf\": container with ID starting with 5e5ae1c6a7cbc0df617fc4b8a0ecc642e9ec06947dc3b86c1ea6e07b6bc466bf not found: ID does not exist" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.361865 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.362287 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.376739 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.473977 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzqq\" (UniqueName: \"kubernetes.io/projected/9351c230-91b7-40c0-afbc-8adad7604ad4-kube-api-access-cvzqq\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.474875 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9351c230-91b7-40c0-afbc-8adad7604ad4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.474974 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9351c230-91b7-40c0-afbc-8adad7604ad4-logs\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.475043 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.475232 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-scripts\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.475458 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.475514 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.475548 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-config-data\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.577608 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.577664 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.577682 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-config-data\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.577749 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzqq\" (UniqueName: \"kubernetes.io/projected/9351c230-91b7-40c0-afbc-8adad7604ad4-kube-api-access-cvzqq\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.577790 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9351c230-91b7-40c0-afbc-8adad7604ad4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.577820 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9351c230-91b7-40c0-afbc-8adad7604ad4-logs\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.577846 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.577890 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-scripts\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.579905 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9351c230-91b7-40c0-afbc-8adad7604ad4-logs\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.580495 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9351c230-91b7-40c0-afbc-8adad7604ad4-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.580650 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.583846 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.584596 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-config-data\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.584976 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.584985 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9351c230-91b7-40c0-afbc-8adad7604ad4-scripts\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " 
pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.598417 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvzqq\" (UniqueName: \"kubernetes.io/projected/9351c230-91b7-40c0-afbc-8adad7604ad4-kube-api-access-cvzqq\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.617246 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9351c230-91b7-40c0-afbc-8adad7604ad4\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:28 crc kubenswrapper[4696]: I0318 15:58:28.688511 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.009894 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.092908 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"190416e8-e777-4ec5-b017-8b28d749252e\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.093033 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-config-data\") pod \"190416e8-e777-4ec5-b017-8b28d749252e\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.093063 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdntd\" (UniqueName: \"kubernetes.io/projected/190416e8-e777-4ec5-b017-8b28d749252e-kube-api-access-hdntd\") pod \"190416e8-e777-4ec5-b017-8b28d749252e\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.093301 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-combined-ca-bundle\") pod \"190416e8-e777-4ec5-b017-8b28d749252e\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.093418 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-httpd-run\") pod \"190416e8-e777-4ec5-b017-8b28d749252e\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.093486 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-internal-tls-certs\") pod \"190416e8-e777-4ec5-b017-8b28d749252e\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.093528 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-logs\") pod \"190416e8-e777-4ec5-b017-8b28d749252e\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.093572 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-scripts\") pod \"190416e8-e777-4ec5-b017-8b28d749252e\" (UID: \"190416e8-e777-4ec5-b017-8b28d749252e\") " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.096078 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "190416e8-e777-4ec5-b017-8b28d749252e" (UID: "190416e8-e777-4ec5-b017-8b28d749252e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.099563 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-logs" (OuterVolumeSpecName: "logs") pod "190416e8-e777-4ec5-b017-8b28d749252e" (UID: "190416e8-e777-4ec5-b017-8b28d749252e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.104708 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "190416e8-e777-4ec5-b017-8b28d749252e" (UID: "190416e8-e777-4ec5-b017-8b28d749252e"). 
InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.105146 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190416e8-e777-4ec5-b017-8b28d749252e-kube-api-access-hdntd" (OuterVolumeSpecName: "kube-api-access-hdntd") pod "190416e8-e777-4ec5-b017-8b28d749252e" (UID: "190416e8-e777-4ec5-b017-8b28d749252e"). InnerVolumeSpecName "kube-api-access-hdntd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.113284 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-scripts" (OuterVolumeSpecName: "scripts") pod "190416e8-e777-4ec5-b017-8b28d749252e" (UID: "190416e8-e777-4ec5-b017-8b28d749252e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.119716 4696 generic.go:334] "Generic (PLEG): container finished" podID="190416e8-e777-4ec5-b017-8b28d749252e" containerID="7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411" exitCode=0 Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.119771 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"190416e8-e777-4ec5-b017-8b28d749252e","Type":"ContainerDied","Data":"7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411"} Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.119809 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"190416e8-e777-4ec5-b017-8b28d749252e","Type":"ContainerDied","Data":"3d04bd896c066070599c5d03286bb0dea2a3217879ef90a04f5956fc7d6c0579"} Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.119831 4696 scope.go:117] "RemoveContainer" 
containerID="7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.119833 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.148493 4696 scope.go:117] "RemoveContainer" containerID="d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.163282 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-config-data" (OuterVolumeSpecName: "config-data") pod "190416e8-e777-4ec5-b017-8b28d749252e" (UID: "190416e8-e777-4ec5-b017-8b28d749252e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.168813 4696 scope.go:117] "RemoveContainer" containerID="7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.171607 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "190416e8-e777-4ec5-b017-8b28d749252e" (UID: "190416e8-e777-4ec5-b017-8b28d749252e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:29 crc kubenswrapper[4696]: E0318 15:58:29.172410 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411\": container with ID starting with 7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411 not found: ID does not exist" containerID="7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.172459 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411"} err="failed to get container status \"7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411\": rpc error: code = NotFound desc = could not find container \"7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411\": container with ID starting with 7ecf4ec8c1ee1a7ace92b6c49a1bcdeee72baeaba961275535af959633d7d411 not found: ID does not exist" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.172502 4696 scope.go:117] "RemoveContainer" containerID="d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30" Mar 18 15:58:29 crc kubenswrapper[4696]: E0318 15:58:29.173101 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30\": container with ID starting with d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30 not found: ID does not exist" containerID="d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.173133 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30"} err="failed 
to get container status \"d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30\": rpc error: code = NotFound desc = could not find container \"d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30\": container with ID starting with d658a29f107b38ab5ad3e6cc848018faada651586a9cdf0017614ef7dd0e2c30 not found: ID does not exist" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.193651 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "190416e8-e777-4ec5-b017-8b28d749252e" (UID: "190416e8-e777-4ec5-b017-8b28d749252e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.196314 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.196336 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.196345 4696 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.196354 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190416e8-e777-4ec5-b017-8b28d749252e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.196362 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.196400 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.196409 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190416e8-e777-4ec5-b017-8b28d749252e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.196417 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdntd\" (UniqueName: \"kubernetes.io/projected/190416e8-e777-4ec5-b017-8b28d749252e-kube-api-access-hdntd\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.199823 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:29 crc kubenswrapper[4696]: W0318 15:58:29.209173 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9351c230_91b7_40c0_afbc_8adad7604ad4.slice/crio-3903eaa36589635f15519658392b066d463a8430e46e07d36c49ef5ceda2879b WatchSource:0}: Error finding container 3903eaa36589635f15519658392b066d463a8430e46e07d36c49ef5ceda2879b: Status 404 returned error can't find the container with id 3903eaa36589635f15519658392b066d463a8430e46e07d36c49ef5ceda2879b Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.232274 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.298843 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.476707 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.511736 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.522756 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:29 crc kubenswrapper[4696]: E0318 15:58:29.523356 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-log" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.523371 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-log" Mar 18 15:58:29 crc kubenswrapper[4696]: E0318 15:58:29.523396 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-httpd" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.523402 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-httpd" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.523615 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-log" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.523632 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="190416e8-e777-4ec5-b017-8b28d749252e" containerName="glance-httpd" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.524800 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.528397 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.528739 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.532648 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.606062 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.606136 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.606170 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f1e526e-e856-452e-8fc6-26663ca20e4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.606188 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.606234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggbm\" (UniqueName: \"kubernetes.io/projected/1f1e526e-e856-452e-8fc6-26663ca20e4a-kube-api-access-8ggbm\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.606310 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f1e526e-e856-452e-8fc6-26663ca20e4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.606339 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.606374 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.613718 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="190416e8-e777-4ec5-b017-8b28d749252e" path="/var/lib/kubelet/pods/190416e8-e777-4ec5-b017-8b28d749252e/volumes" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.614618 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a6f156-f710-4c15-a20f-649b27d7e7d6" path="/var/lib/kubelet/pods/76a6f156-f710-4c15-a20f-649b27d7e7d6/volumes" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.709121 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f1e526e-e856-452e-8fc6-26663ca20e4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.709456 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.709598 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.709736 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.709866 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.710035 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f1e526e-e856-452e-8fc6-26663ca20e4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.709962 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.710135 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.710479 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggbm\" (UniqueName: \"kubernetes.io/projected/1f1e526e-e856-452e-8fc6-26663ca20e4a-kube-api-access-8ggbm\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.710903 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f1e526e-e856-452e-8fc6-26663ca20e4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.711672 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f1e526e-e856-452e-8fc6-26663ca20e4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.720439 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.720623 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.733679 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.735732 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1e526e-e856-452e-8fc6-26663ca20e4a-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.739108 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggbm\" (UniqueName: \"kubernetes.io/projected/1f1e526e-e856-452e-8fc6-26663ca20e4a-kube-api-access-8ggbm\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.776138 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1f1e526e-e856-452e-8fc6-26663ca20e4a\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.842994 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.945492 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b9zfd"] Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.947157 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.950944 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.952444 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gxncm" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.952571 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 15:58:29 crc kubenswrapper[4696]: I0318 15:58:29.961971 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b9zfd"] Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.017599 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-config-data\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.017712 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-scripts\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.017771 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " 
pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.017829 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xcqn\" (UniqueName: \"kubernetes.io/projected/02c1777e-d983-4003-a570-ce8c867cb635-kube-api-access-9xcqn\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.121597 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-config-data\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.121927 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-scripts\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.122125 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.122167 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xcqn\" (UniqueName: \"kubernetes.io/projected/02c1777e-d983-4003-a570-ce8c867cb635-kube-api-access-9xcqn\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: 
\"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.127822 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-config-data\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.134250 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-scripts\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.130571 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.145940 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9351c230-91b7-40c0-afbc-8adad7604ad4","Type":"ContainerStarted","Data":"5a9410b4df6330f4231609267523abe22c9a2a632552414ab2032cdb40085fed"} Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.146032 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9351c230-91b7-40c0-afbc-8adad7604ad4","Type":"ContainerStarted","Data":"3903eaa36589635f15519658392b066d463a8430e46e07d36c49ef5ceda2879b"} Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.163420 4696 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9xcqn\" (UniqueName: \"kubernetes.io/projected/02c1777e-d983-4003-a570-ce8c867cb635-kube-api-access-9xcqn\") pod \"nova-cell0-conductor-db-sync-b9zfd\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.282974 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.518653 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:30 crc kubenswrapper[4696]: I0318 15:58:30.814724 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b9zfd"] Mar 18 15:58:30 crc kubenswrapper[4696]: W0318 15:58:30.826773 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c1777e_d983_4003_a570_ce8c867cb635.slice/crio-504428eb90aeeb7d3170b9180b5d88a6facd719c54057da59782c627b01a330d WatchSource:0}: Error finding container 504428eb90aeeb7d3170b9180b5d88a6facd719c54057da59782c627b01a330d: Status 404 returned error can't find the container with id 504428eb90aeeb7d3170b9180b5d88a6facd719c54057da59782c627b01a330d Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.162840 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f1e526e-e856-452e-8fc6-26663ca20e4a","Type":"ContainerStarted","Data":"c18072c599a7f676c463d081127ceef037652e604f30b6f0ee09755f6d603e40"} Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.165450 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" event={"ID":"02c1777e-d983-4003-a570-ce8c867cb635","Type":"ContainerStarted","Data":"504428eb90aeeb7d3170b9180b5d88a6facd719c54057da59782c627b01a330d"} Mar 
18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.167725 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9351c230-91b7-40c0-afbc-8adad7604ad4","Type":"ContainerStarted","Data":"e92a5b5b66eb1427a292e52161f886b172709560f33b70357f9fe7fbefa485d8"} Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.198104 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.19808002 podStartE2EDuration="3.19808002s" podCreationTimestamp="2026-03-18 15:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:31.187259999 +0000 UTC m=+1354.193434205" watchObservedRunningTime="2026-03-18 15:58:31.19808002 +0000 UTC m=+1354.204254226" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.767608 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.878099 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-combined-ca-bundle\") pod \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.878199 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-config-data\") pod \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.878257 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/72021b21-00cf-4c33-be2d-b24f20dc0f9f-logs\") pod \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.878301 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-secret-key\") pod \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.878416 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-scripts\") pod \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.878557 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr44m\" (UniqueName: \"kubernetes.io/projected/72021b21-00cf-4c33-be2d-b24f20dc0f9f-kube-api-access-qr44m\") pod \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.878666 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-tls-certs\") pod \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\" (UID: \"72021b21-00cf-4c33-be2d-b24f20dc0f9f\") " Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.880028 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72021b21-00cf-4c33-be2d-b24f20dc0f9f-logs" (OuterVolumeSpecName: "logs") pod "72021b21-00cf-4c33-be2d-b24f20dc0f9f" (UID: "72021b21-00cf-4c33-be2d-b24f20dc0f9f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.891201 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "72021b21-00cf-4c33-be2d-b24f20dc0f9f" (UID: "72021b21-00cf-4c33-be2d-b24f20dc0f9f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.893415 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72021b21-00cf-4c33-be2d-b24f20dc0f9f-kube-api-access-qr44m" (OuterVolumeSpecName: "kube-api-access-qr44m") pod "72021b21-00cf-4c33-be2d-b24f20dc0f9f" (UID: "72021b21-00cf-4c33-be2d-b24f20dc0f9f"). InnerVolumeSpecName "kube-api-access-qr44m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.916588 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-scripts" (OuterVolumeSpecName: "scripts") pod "72021b21-00cf-4c33-be2d-b24f20dc0f9f" (UID: "72021b21-00cf-4c33-be2d-b24f20dc0f9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.921771 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72021b21-00cf-4c33-be2d-b24f20dc0f9f" (UID: "72021b21-00cf-4c33-be2d-b24f20dc0f9f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.981551 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.981886 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72021b21-00cf-4c33-be2d-b24f20dc0f9f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.981958 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.982017 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:31 crc kubenswrapper[4696]: I0318 15:58:31.982117 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr44m\" (UniqueName: \"kubernetes.io/projected/72021b21-00cf-4c33-be2d-b24f20dc0f9f-kube-api-access-qr44m\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.001111 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-config-data" (OuterVolumeSpecName: "config-data") pod "72021b21-00cf-4c33-be2d-b24f20dc0f9f" (UID: "72021b21-00cf-4c33-be2d-b24f20dc0f9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.019093 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "72021b21-00cf-4c33-be2d-b24f20dc0f9f" (UID: "72021b21-00cf-4c33-be2d-b24f20dc0f9f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.084600 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/72021b21-00cf-4c33-be2d-b24f20dc0f9f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.084651 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72021b21-00cf-4c33-be2d-b24f20dc0f9f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.231180 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f1e526e-e856-452e-8fc6-26663ca20e4a","Type":"ContainerStarted","Data":"1d6d2e2c5bdab88d7596d63d9f9df632b3d03a2da0f08486ac3a56f860ae600e"} Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.231252 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1f1e526e-e856-452e-8fc6-26663ca20e4a","Type":"ContainerStarted","Data":"c1c7da9ddf278fb7c2d098cb1e7085bdeaee6297cf61c64d8309f86525d99562"} Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.247239 4696 generic.go:334] "Generic (PLEG): container finished" podID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerID="edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d" exitCode=137 Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.247343 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-696476876d-4rxz2" event={"ID":"72021b21-00cf-4c33-be2d-b24f20dc0f9f","Type":"ContainerDied","Data":"edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d"} Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.247361 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-696476876d-4rxz2" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.247417 4696 scope.go:117] "RemoveContainer" containerID="aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.247399 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-696476876d-4rxz2" event={"ID":"72021b21-00cf-4c33-be2d-b24f20dc0f9f","Type":"ContainerDied","Data":"d4ca6da5c146bca7f3b27142eac4ced4392eec6952db9bb6b4a57aac46b40ec9"} Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.271434 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.271410225 podStartE2EDuration="3.271410225s" podCreationTimestamp="2026-03-18 15:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:32.26521765 +0000 UTC m=+1355.271391856" watchObservedRunningTime="2026-03-18 15:58:32.271410225 +0000 UTC m=+1355.277584431" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.328761 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-696476876d-4rxz2"] Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.342548 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-696476876d-4rxz2"] Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.474445 4696 scope.go:117] "RemoveContainer" containerID="edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d" Mar 18 15:58:32 
crc kubenswrapper[4696]: I0318 15:58:32.529247 4696 scope.go:117] "RemoveContainer" containerID="aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420" Mar 18 15:58:32 crc kubenswrapper[4696]: E0318 15:58:32.530294 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420\": container with ID starting with aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420 not found: ID does not exist" containerID="aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.530353 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420"} err="failed to get container status \"aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420\": rpc error: code = NotFound desc = could not find container \"aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420\": container with ID starting with aa6ce0736d260c0e890babe3e97927ddf6d6dc7f76b06938d567fa9e56ca3420 not found: ID does not exist" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.530392 4696 scope.go:117] "RemoveContainer" containerID="edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d" Mar 18 15:58:32 crc kubenswrapper[4696]: E0318 15:58:32.530811 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d\": container with ID starting with edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d not found: ID does not exist" containerID="edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d" Mar 18 15:58:32 crc kubenswrapper[4696]: I0318 15:58:32.530865 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d"} err="failed to get container status \"edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d\": rpc error: code = NotFound desc = could not find container \"edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d\": container with ID starting with edfbd6cd9033e70288358ec7f95bf3e894b8c09d6b8a9796e6f6a76bce599b5d not found: ID does not exist" Mar 18 15:58:33 crc kubenswrapper[4696]: I0318 15:58:33.612404 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" path="/var/lib/kubelet/pods/72021b21-00cf-4c33-be2d-b24f20dc0f9f/volumes" Mar 18 15:58:37 crc kubenswrapper[4696]: I0318 15:58:37.269146 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 15:58:38 crc kubenswrapper[4696]: I0318 15:58:38.689402 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 15:58:38 crc kubenswrapper[4696]: I0318 15:58:38.689837 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 15:58:38 crc kubenswrapper[4696]: I0318 15:58:38.720637 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 15:58:38 crc kubenswrapper[4696]: I0318 15:58:38.728984 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 15:58:39 crc kubenswrapper[4696]: I0318 15:58:39.339639 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" 
event={"ID":"02c1777e-d983-4003-a570-ce8c867cb635","Type":"ContainerStarted","Data":"2ba0a95e059696a948f85ffaffff7aa7263f9f6928a81c9152a6782f7a8cd566"} Mar 18 15:58:39 crc kubenswrapper[4696]: I0318 15:58:39.340111 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 15:58:39 crc kubenswrapper[4696]: I0318 15:58:39.340137 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 15:58:39 crc kubenswrapper[4696]: I0318 15:58:39.362075 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" podStartSLOduration=2.159207267 podStartE2EDuration="10.362047594s" podCreationTimestamp="2026-03-18 15:58:29 +0000 UTC" firstStartedPulling="2026-03-18 15:58:30.831084785 +0000 UTC m=+1353.837258991" lastFinishedPulling="2026-03-18 15:58:39.033925112 +0000 UTC m=+1362.040099318" observedRunningTime="2026-03-18 15:58:39.356927065 +0000 UTC m=+1362.363101281" watchObservedRunningTime="2026-03-18 15:58:39.362047594 +0000 UTC m=+1362.368221810" Mar 18 15:58:39 crc kubenswrapper[4696]: I0318 15:58:39.843490 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:39 crc kubenswrapper[4696]: I0318 15:58:39.843585 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:39 crc kubenswrapper[4696]: I0318 15:58:39.876282 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:39 crc kubenswrapper[4696]: I0318 15:58:39.892998 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:40 crc kubenswrapper[4696]: I0318 15:58:40.348837 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:40 crc kubenswrapper[4696]: I0318 15:58:40.349244 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:41 crc kubenswrapper[4696]: I0318 15:58:41.445184 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 15:58:41 crc kubenswrapper[4696]: I0318 15:58:41.445354 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:58:41 crc kubenswrapper[4696]: I0318 15:58:41.451350 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 15:58:42 crc kubenswrapper[4696]: I0318 15:58:42.185266 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:58:42 crc kubenswrapper[4696]: I0318 15:58:42.185731 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:58:42 crc kubenswrapper[4696]: I0318 15:58:42.367107 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:58:42 crc kubenswrapper[4696]: I0318 15:58:42.367141 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:58:42 crc kubenswrapper[4696]: I0318 15:58:42.819014 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:42 crc kubenswrapper[4696]: 
I0318 15:58:42.825660 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.242244 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.303470 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-config-data\") pod \"4c585254-06f5-4cf1-bc73-27d3b04795f0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.303670 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-run-httpd\") pod \"4c585254-06f5-4cf1-bc73-27d3b04795f0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.303859 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-scripts\") pod \"4c585254-06f5-4cf1-bc73-27d3b04795f0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.303941 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-log-httpd\") pod \"4c585254-06f5-4cf1-bc73-27d3b04795f0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.303987 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkwwx\" (UniqueName: \"kubernetes.io/projected/4c585254-06f5-4cf1-bc73-27d3b04795f0-kube-api-access-tkwwx\") pod 
\"4c585254-06f5-4cf1-bc73-27d3b04795f0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.304046 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-combined-ca-bundle\") pod \"4c585254-06f5-4cf1-bc73-27d3b04795f0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.304074 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-sg-core-conf-yaml\") pod \"4c585254-06f5-4cf1-bc73-27d3b04795f0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.304436 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4c585254-06f5-4cf1-bc73-27d3b04795f0" (UID: "4c585254-06f5-4cf1-bc73-27d3b04795f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.304565 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4c585254-06f5-4cf1-bc73-27d3b04795f0" (UID: "4c585254-06f5-4cf1-bc73-27d3b04795f0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.305774 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.305801 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c585254-06f5-4cf1-bc73-27d3b04795f0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.310275 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-scripts" (OuterVolumeSpecName: "scripts") pod "4c585254-06f5-4cf1-bc73-27d3b04795f0" (UID: "4c585254-06f5-4cf1-bc73-27d3b04795f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.327693 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c585254-06f5-4cf1-bc73-27d3b04795f0-kube-api-access-tkwwx" (OuterVolumeSpecName: "kube-api-access-tkwwx") pod "4c585254-06f5-4cf1-bc73-27d3b04795f0" (UID: "4c585254-06f5-4cf1-bc73-27d3b04795f0"). InnerVolumeSpecName "kube-api-access-tkwwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.335805 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4c585254-06f5-4cf1-bc73-27d3b04795f0" (UID: "4c585254-06f5-4cf1-bc73-27d3b04795f0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.384203 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c585254-06f5-4cf1-bc73-27d3b04795f0" (UID: "4c585254-06f5-4cf1-bc73-27d3b04795f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.406623 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-config-data" (OuterVolumeSpecName: "config-data") pod "4c585254-06f5-4cf1-bc73-27d3b04795f0" (UID: "4c585254-06f5-4cf1-bc73-27d3b04795f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.407572 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-config-data\") pod \"4c585254-06f5-4cf1-bc73-27d3b04795f0\" (UID: \"4c585254-06f5-4cf1-bc73-27d3b04795f0\") " Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.408172 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkwwx\" (UniqueName: \"kubernetes.io/projected/4c585254-06f5-4cf1-bc73-27d3b04795f0-kube-api-access-tkwwx\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.408193 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.408203 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.408213 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:51 crc kubenswrapper[4696]: W0318 15:58:51.408302 4696 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4c585254-06f5-4cf1-bc73-27d3b04795f0/volumes/kubernetes.io~secret/config-data Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.408315 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-config-data" (OuterVolumeSpecName: "config-data") pod "4c585254-06f5-4cf1-bc73-27d3b04795f0" (UID: "4c585254-06f5-4cf1-bc73-27d3b04795f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.478433 4696 generic.go:334] "Generic (PLEG): container finished" podID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerID="df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129" exitCode=137 Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.478503 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerDied","Data":"df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129"} Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.478572 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c585254-06f5-4cf1-bc73-27d3b04795f0","Type":"ContainerDied","Data":"5941e3060510fb9db038e5517fcbbe5053e7554660cf57030e70eaef3422b99b"} Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.478601 4696 scope.go:117] "RemoveContainer" 
containerID="df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.478825 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.511622 4696 scope.go:117] "RemoveContainer" containerID="d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.513492 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c585254-06f5-4cf1-bc73-27d3b04795f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.539198 4696 scope.go:117] "RemoveContainer" containerID="b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.547338 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.558421 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.563770 4696 scope.go:117] "RemoveContainer" containerID="a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.619185 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" path="/var/lib/kubelet/pods/4c585254-06f5-4cf1-bc73-27d3b04795f0/volumes" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.621537 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.622357 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="ceilometer-central-agent" Mar 18 15:58:51 crc kubenswrapper[4696]: 
I0318 15:58:51.622405 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="ceilometer-central-agent" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.622436 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon-log" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.622446 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon-log" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.622495 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="proxy-httpd" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.622506 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="proxy-httpd" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.622943 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="sg-core" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.622959 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="sg-core" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.622973 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.623009 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.623039 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="ceilometer-notification-agent" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.623049 4696 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="ceilometer-notification-agent" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.623455 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="ceilometer-central-agent" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.623501 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="ceilometer-notification-agent" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.623547 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="proxy-httpd" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.623564 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c585254-06f5-4cf1-bc73-27d3b04795f0" containerName="sg-core" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.623575 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon-log" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.623585 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="72021b21-00cf-4c33-be2d-b24f20dc0f9f" containerName="horizon" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.627233 4696 scope.go:117] "RemoveContainer" containerID="df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.627887 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129\": container with ID starting with df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129 not found: ID does not exist" containerID="df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129" 
Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.627943 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129"} err="failed to get container status \"df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129\": rpc error: code = NotFound desc = could not find container \"df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129\": container with ID starting with df120cc5d308cf571b9ffea895baf15f02faa145a45dc7b995de9fe9bce01129 not found: ID does not exist" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.627979 4696 scope.go:117] "RemoveContainer" containerID="d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.634410 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.641699 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4\": container with ID starting with d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4 not found: ID does not exist" containerID="d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.641761 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4"} err="failed to get container status \"d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4\": rpc error: code = NotFound desc = could not find container \"d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4\": container with ID starting with d5eff175377934b3e9d69dbf27c4bb389f60c939fba10320f826cdc68494f2e4 not found: ID does not exist" 
Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.641793 4696 scope.go:117] "RemoveContainer" containerID="b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.642229 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.642400 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.643351 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd\": container with ID starting with b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd not found: ID does not exist" containerID="b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.643413 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd"} err="failed to get container status \"b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd\": rpc error: code = NotFound desc = could not find container \"b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd\": container with ID starting with b973703b54594b73c9c3907eac11c2ed9bd05b0c5e256c40521fdda0988701cd not found: ID does not exist" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.643455 4696 scope.go:117] "RemoveContainer" containerID="a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d" Mar 18 15:58:51 crc kubenswrapper[4696]: E0318 15:58:51.644633 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d\": container with ID starting with a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d not found: ID does not exist" containerID="a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.644668 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d"} err="failed to get container status \"a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d\": rpc error: code = NotFound desc = could not find container \"a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d\": container with ID starting with a9112436017efdb616a37c02b816428af03126e10676e4a36078d2e5dcfbc66d not found: ID does not exist" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.669138 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.724007 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-scripts\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.724064 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwr2r\" (UniqueName: \"kubernetes.io/projected/1e892acc-6409-4459-8950-cdd7d43a024d-kube-api-access-nwr2r\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.724087 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-log-httpd\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.724128 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-run-httpd\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.724154 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.724201 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.724348 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-config-data\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.826289 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") 
" pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.826369 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.826488 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-config-data\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.826508 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-scripts\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.826548 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwr2r\" (UniqueName: \"kubernetes.io/projected/1e892acc-6409-4459-8950-cdd7d43a024d-kube-api-access-nwr2r\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.826568 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-log-httpd\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.826597 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-run-httpd\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.827068 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-run-httpd\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.827665 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-log-httpd\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.831552 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.831589 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-scripts\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.831574 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.835623 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-config-data\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.852981 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwr2r\" (UniqueName: \"kubernetes.io/projected/1e892acc-6409-4459-8950-cdd7d43a024d-kube-api-access-nwr2r\") pod \"ceilometer-0\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " pod="openstack/ceilometer-0" Mar 18 15:58:51 crc kubenswrapper[4696]: I0318 15:58:51.978029 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:58:52 crc kubenswrapper[4696]: I0318 15:58:52.499656 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:52 crc kubenswrapper[4696]: I0318 15:58:52.518222 4696 generic.go:334] "Generic (PLEG): container finished" podID="02c1777e-d983-4003-a570-ce8c867cb635" containerID="2ba0a95e059696a948f85ffaffff7aa7263f9f6928a81c9152a6782f7a8cd566" exitCode=0 Mar 18 15:58:52 crc kubenswrapper[4696]: I0318 15:58:52.518302 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" event={"ID":"02c1777e-d983-4003-a570-ce8c867cb635","Type":"ContainerDied","Data":"2ba0a95e059696a948f85ffaffff7aa7263f9f6928a81c9152a6782f7a8cd566"} Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.532232 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerStarted","Data":"60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef"} Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.534135 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerStarted","Data":"43fa062e0761582a5d37351b803ecf46aa6897d02c6274c51674937418872263"} Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.880723 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.975080 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-scripts\") pod \"02c1777e-d983-4003-a570-ce8c867cb635\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.975371 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-config-data\") pod \"02c1777e-d983-4003-a570-ce8c867cb635\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.976272 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-combined-ca-bundle\") pod \"02c1777e-d983-4003-a570-ce8c867cb635\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.976471 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xcqn\" (UniqueName: \"kubernetes.io/projected/02c1777e-d983-4003-a570-ce8c867cb635-kube-api-access-9xcqn\") pod \"02c1777e-d983-4003-a570-ce8c867cb635\" (UID: \"02c1777e-d983-4003-a570-ce8c867cb635\") " Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.979946 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c1777e-d983-4003-a570-ce8c867cb635-kube-api-access-9xcqn" 
(OuterVolumeSpecName: "kube-api-access-9xcqn") pod "02c1777e-d983-4003-a570-ce8c867cb635" (UID: "02c1777e-d983-4003-a570-ce8c867cb635"). InnerVolumeSpecName "kube-api-access-9xcqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:53 crc kubenswrapper[4696]: I0318 15:58:53.980099 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-scripts" (OuterVolumeSpecName: "scripts") pod "02c1777e-d983-4003-a570-ce8c867cb635" (UID: "02c1777e-d983-4003-a570-ce8c867cb635"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.004593 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-config-data" (OuterVolumeSpecName: "config-data") pod "02c1777e-d983-4003-a570-ce8c867cb635" (UID: "02c1777e-d983-4003-a570-ce8c867cb635"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.009122 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02c1777e-d983-4003-a570-ce8c867cb635" (UID: "02c1777e-d983-4003-a570-ce8c867cb635"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.079707 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.079756 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.079777 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xcqn\" (UniqueName: \"kubernetes.io/projected/02c1777e-d983-4003-a570-ce8c867cb635-kube-api-access-9xcqn\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.079789 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c1777e-d983-4003-a570-ce8c867cb635-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.547188 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerStarted","Data":"81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593"} Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.551513 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" event={"ID":"02c1777e-d983-4003-a570-ce8c867cb635","Type":"ContainerDied","Data":"504428eb90aeeb7d3170b9180b5d88a6facd719c54057da59782c627b01a330d"} Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.551587 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="504428eb90aeeb7d3170b9180b5d88a6facd719c54057da59782c627b01a330d" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 
15:58:54.551672 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-b9zfd" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.730761 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:58:54 crc kubenswrapper[4696]: E0318 15:58:54.731595 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c1777e-d983-4003-a570-ce8c867cb635" containerName="nova-cell0-conductor-db-sync" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.731616 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c1777e-d983-4003-a570-ce8c867cb635" containerName="nova-cell0-conductor-db-sync" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.731857 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c1777e-d983-4003-a570-ce8c867cb635" containerName="nova-cell0-conductor-db-sync" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.732924 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.738087 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.738283 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gxncm" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.745436 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.894958 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4zst\" (UniqueName: \"kubernetes.io/projected/73d0831c-120c-43ad-8e4e-cb796c5fb554-kube-api-access-g4zst\") pod \"nova-cell0-conductor-0\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.895173 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.895233 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.997551 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.997966 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:54 crc kubenswrapper[4696]: I0318 15:58:54.998090 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4zst\" (UniqueName: \"kubernetes.io/projected/73d0831c-120c-43ad-8e4e-cb796c5fb554-kube-api-access-g4zst\") pod \"nova-cell0-conductor-0\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:55 crc kubenswrapper[4696]: I0318 15:58:55.004805 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:55 crc kubenswrapper[4696]: I0318 15:58:55.005857 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:55 crc kubenswrapper[4696]: I0318 15:58:55.019952 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4zst\" (UniqueName: \"kubernetes.io/projected/73d0831c-120c-43ad-8e4e-cb796c5fb554-kube-api-access-g4zst\") pod \"nova-cell0-conductor-0\" 
(UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:55 crc kubenswrapper[4696]: I0318 15:58:55.083081 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:55 crc kubenswrapper[4696]: I0318 15:58:55.565850 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerStarted","Data":"956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0"} Mar 18 15:58:55 crc kubenswrapper[4696]: I0318 15:58:55.648361 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:58:55 crc kubenswrapper[4696]: I0318 15:58:55.985597 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:58:56 crc kubenswrapper[4696]: I0318 15:58:56.583747 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"73d0831c-120c-43ad-8e4e-cb796c5fb554","Type":"ContainerStarted","Data":"8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c"} Mar 18 15:58:56 crc kubenswrapper[4696]: I0318 15:58:56.585318 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"73d0831c-120c-43ad-8e4e-cb796c5fb554","Type":"ContainerStarted","Data":"d14d7fab81d4d80fe4e8e57ac7783bd5f6fbf83a01b4d74739fe7c45a80424ca"} Mar 18 15:58:56 crc kubenswrapper[4696]: I0318 15:58:56.585450 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 15:58:56 crc kubenswrapper[4696]: I0318 15:58:56.606858 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.606827823 podStartE2EDuration="2.606827823s" podCreationTimestamp="2026-03-18 15:58:54 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:56.60069468 +0000 UTC m=+1379.606868886" watchObservedRunningTime="2026-03-18 15:58:56.606827823 +0000 UTC m=+1379.613002029" Mar 18 15:58:57 crc kubenswrapper[4696]: I0318 15:58:57.603971 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" gracePeriod=30 Mar 18 15:58:57 crc kubenswrapper[4696]: I0318 15:58:57.621867 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 15:58:57 crc kubenswrapper[4696]: I0318 15:58:57.621918 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerStarted","Data":"9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf"} Mar 18 15:58:57 crc kubenswrapper[4696]: I0318 15:58:57.679172 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.293367098 podStartE2EDuration="6.679142853s" podCreationTimestamp="2026-03-18 15:58:51 +0000 UTC" firstStartedPulling="2026-03-18 15:58:52.517022565 +0000 UTC m=+1375.523196771" lastFinishedPulling="2026-03-18 15:58:56.90279832 +0000 UTC m=+1379.908972526" observedRunningTime="2026-03-18 15:58:57.66507082 +0000 UTC m=+1380.671245026" watchObservedRunningTime="2026-03-18 15:58:57.679142853 +0000 UTC m=+1380.685317059" Mar 18 15:58:57 crc kubenswrapper[4696]: I0318 15:58:57.991645 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:58:59 crc kubenswrapper[4696]: I0318 15:58:59.616812 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="ceilometer-central-agent" containerID="cri-o://60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef" gracePeriod=30 Mar 18 15:58:59 crc kubenswrapper[4696]: I0318 15:58:59.619959 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="proxy-httpd" containerID="cri-o://9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf" gracePeriod=30 Mar 18 15:58:59 crc kubenswrapper[4696]: I0318 15:58:59.620116 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="sg-core" containerID="cri-o://956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0" gracePeriod=30 Mar 18 15:58:59 crc kubenswrapper[4696]: I0318 15:58:59.620937 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="ceilometer-notification-agent" containerID="cri-o://81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593" gracePeriod=30 Mar 18 15:59:00 crc kubenswrapper[4696]: E0318 15:59:00.087067 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:00 crc kubenswrapper[4696]: E0318 15:59:00.089654 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:00 crc kubenswrapper[4696]: E0318 15:59:00.091810 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:00 crc kubenswrapper[4696]: E0318 15:59:00.091934 4696 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:00 crc kubenswrapper[4696]: I0318 15:59:00.659187 4696 generic.go:334] "Generic (PLEG): container finished" podID="1e892acc-6409-4459-8950-cdd7d43a024d" containerID="9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf" exitCode=0 Mar 18 15:59:00 crc kubenswrapper[4696]: I0318 15:59:00.659228 4696 generic.go:334] "Generic (PLEG): container finished" podID="1e892acc-6409-4459-8950-cdd7d43a024d" containerID="956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0" exitCode=2 Mar 18 15:59:00 crc kubenswrapper[4696]: I0318 15:59:00.659235 4696 generic.go:334] "Generic (PLEG): container finished" podID="1e892acc-6409-4459-8950-cdd7d43a024d" containerID="81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593" exitCode=0 Mar 18 15:59:00 crc kubenswrapper[4696]: I0318 15:59:00.659265 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerDied","Data":"9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf"} Mar 18 15:59:00 crc kubenswrapper[4696]: I0318 15:59:00.659301 4696 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerDied","Data":"956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0"} Mar 18 15:59:00 crc kubenswrapper[4696]: I0318 15:59:00.659311 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerDied","Data":"81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593"} Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.396122 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.536148 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-sg-core-conf-yaml\") pod \"1e892acc-6409-4459-8950-cdd7d43a024d\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.536268 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-combined-ca-bundle\") pod \"1e892acc-6409-4459-8950-cdd7d43a024d\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.536366 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-run-httpd\") pod \"1e892acc-6409-4459-8950-cdd7d43a024d\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.536426 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwr2r\" (UniqueName: 
\"kubernetes.io/projected/1e892acc-6409-4459-8950-cdd7d43a024d-kube-api-access-nwr2r\") pod \"1e892acc-6409-4459-8950-cdd7d43a024d\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.536553 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-scripts\") pod \"1e892acc-6409-4459-8950-cdd7d43a024d\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.536587 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-log-httpd\") pod \"1e892acc-6409-4459-8950-cdd7d43a024d\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.536735 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-config-data\") pod \"1e892acc-6409-4459-8950-cdd7d43a024d\" (UID: \"1e892acc-6409-4459-8950-cdd7d43a024d\") " Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.537411 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1e892acc-6409-4459-8950-cdd7d43a024d" (UID: "1e892acc-6409-4459-8950-cdd7d43a024d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.537743 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1e892acc-6409-4459-8950-cdd7d43a024d" (UID: "1e892acc-6409-4459-8950-cdd7d43a024d"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.543333 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e892acc-6409-4459-8950-cdd7d43a024d-kube-api-access-nwr2r" (OuterVolumeSpecName: "kube-api-access-nwr2r") pod "1e892acc-6409-4459-8950-cdd7d43a024d" (UID: "1e892acc-6409-4459-8950-cdd7d43a024d"). InnerVolumeSpecName "kube-api-access-nwr2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.543440 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-scripts" (OuterVolumeSpecName: "scripts") pod "1e892acc-6409-4459-8950-cdd7d43a024d" (UID: "1e892acc-6409-4459-8950-cdd7d43a024d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.589728 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1e892acc-6409-4459-8950-cdd7d43a024d" (UID: "1e892acc-6409-4459-8950-cdd7d43a024d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.619739 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e892acc-6409-4459-8950-cdd7d43a024d" (UID: "1e892acc-6409-4459-8950-cdd7d43a024d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.639294 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.639340 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.639353 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.639365 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwr2r\" (UniqueName: \"kubernetes.io/projected/1e892acc-6409-4459-8950-cdd7d43a024d-kube-api-access-nwr2r\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.639380 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.639397 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e892acc-6409-4459-8950-cdd7d43a024d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.648107 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-config-data" (OuterVolumeSpecName: "config-data") pod "1e892acc-6409-4459-8950-cdd7d43a024d" (UID: "1e892acc-6409-4459-8950-cdd7d43a024d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.703134 4696 generic.go:334] "Generic (PLEG): container finished" podID="1e892acc-6409-4459-8950-cdd7d43a024d" containerID="60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef" exitCode=0 Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.703192 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerDied","Data":"60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef"} Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.703206 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.703236 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e892acc-6409-4459-8950-cdd7d43a024d","Type":"ContainerDied","Data":"43fa062e0761582a5d37351b803ecf46aa6897d02c6274c51674937418872263"} Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.703260 4696 scope.go:117] "RemoveContainer" containerID="9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.730692 4696 scope.go:117] "RemoveContainer" containerID="956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.743798 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e892acc-6409-4459-8950-cdd7d43a024d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.753991 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.771017 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.772453 4696 scope.go:117] "RemoveContainer" containerID="81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.787813 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:04 crc kubenswrapper[4696]: E0318 15:59:04.788395 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="proxy-httpd" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.788418 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="proxy-httpd" Mar 18 15:59:04 crc kubenswrapper[4696]: E0318 15:59:04.788438 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="ceilometer-central-agent" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.788446 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="ceilometer-central-agent" Mar 18 15:59:04 crc kubenswrapper[4696]: E0318 15:59:04.788467 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="sg-core" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.788474 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="sg-core" Mar 18 15:59:04 crc kubenswrapper[4696]: E0318 15:59:04.788515 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="ceilometer-notification-agent" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.788536 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="ceilometer-notification-agent" Mar 18 15:59:04 crc 
kubenswrapper[4696]: I0318 15:59:04.789328 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="proxy-httpd" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.789355 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="sg-core" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.789371 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="ceilometer-central-agent" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.789393 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" containerName="ceilometer-notification-agent" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.791414 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.795111 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.795394 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.802476 4696 scope.go:117] "RemoveContainer" containerID="60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.816253 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.837461 4696 scope.go:117] "RemoveContainer" containerID="9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf" Mar 18 15:59:04 crc kubenswrapper[4696]: E0318 15:59:04.838022 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf\": container with ID starting with 9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf not found: ID does not exist" containerID="9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.838078 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf"} err="failed to get container status \"9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf\": rpc error: code = NotFound desc = could not find container \"9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf\": container with ID starting with 9052e76d7279f58ea1d24b44eda305cb4f7c9d18f0a5ffd66a474cd67ae727bf not found: ID does not exist" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.838118 4696 scope.go:117] "RemoveContainer" containerID="956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0" Mar 18 15:59:04 crc kubenswrapper[4696]: E0318 15:59:04.838431 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0\": container with ID starting with 956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0 not found: ID does not exist" containerID="956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.838463 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0"} err="failed to get container status \"956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0\": rpc error: code = NotFound desc = could not find container \"956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0\": container with ID 
starting with 956c2f0d4513911e55db42773372c826373e728cfaaaca203528ed94234d47f0 not found: ID does not exist" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.838478 4696 scope.go:117] "RemoveContainer" containerID="81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593" Mar 18 15:59:04 crc kubenswrapper[4696]: E0318 15:59:04.838845 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593\": container with ID starting with 81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593 not found: ID does not exist" containerID="81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.838868 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593"} err="failed to get container status \"81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593\": rpc error: code = NotFound desc = could not find container \"81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593\": container with ID starting with 81fb00d36f055ee95f4212ac31518d5530b266d7168fd7bd517fd578db9a1593 not found: ID does not exist" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.838879 4696 scope.go:117] "RemoveContainer" containerID="60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef" Mar 18 15:59:04 crc kubenswrapper[4696]: E0318 15:59:04.839124 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef\": container with ID starting with 60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef not found: ID does not exist" containerID="60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef" Mar 18 
15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.839147 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef"} err="failed to get container status \"60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef\": rpc error: code = NotFound desc = could not find container \"60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef\": container with ID starting with 60aa26acd4529ec352cfd90645239c1c7b78aaa0ff7d8cfe19e4588c5c83fcef not found: ID does not exist" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.949689 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.950135 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-log-httpd\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.950377 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-config-data\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.950501 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-scripts\") pod \"ceilometer-0\" (UID: 
\"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.950732 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-run-httpd\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.950906 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2d5j\" (UniqueName: \"kubernetes.io/projected/580fa059-673f-4493-a9b7-25f279ab30fe-kube-api-access-s2d5j\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:04 crc kubenswrapper[4696]: I0318 15:59:04.950939 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.052644 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2d5j\" (UniqueName: \"kubernetes.io/projected/580fa059-673f-4493-a9b7-25f279ab30fe-kube-api-access-s2d5j\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.052728 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 
15:59:05.052783 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.052811 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-log-httpd\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.052910 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-config-data\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.052943 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-scripts\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.053001 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-run-httpd\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.053773 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-run-httpd\") pod \"ceilometer-0\" (UID: 
\"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.059557 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-log-httpd\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.065972 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.069916 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.070193 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-scripts\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.071485 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-config-data\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: E0318 15:59:05.092988 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:05 crc kubenswrapper[4696]: E0318 15:59:05.096756 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.097393 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2d5j\" (UniqueName: \"kubernetes.io/projected/580fa059-673f-4493-a9b7-25f279ab30fe-kube-api-access-s2d5j\") pod \"ceilometer-0\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: E0318 15:59:05.098650 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:05 crc kubenswrapper[4696]: E0318 15:59:05.098768 4696 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.138074 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.609368 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e892acc-6409-4459-8950-cdd7d43a024d" path="/var/lib/kubelet/pods/1e892acc-6409-4459-8950-cdd7d43a024d/volumes" Mar 18 15:59:05 crc kubenswrapper[4696]: I0318 15:59:05.786586 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:05 crc kubenswrapper[4696]: W0318 15:59:05.786835 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod580fa059_673f_4493_a9b7_25f279ab30fe.slice/crio-e3de8ee5fca8c9a660ae7ce53f7f2ebcbb7493cb32f15b80e9daffcceaad318c WatchSource:0}: Error finding container e3de8ee5fca8c9a660ae7ce53f7f2ebcbb7493cb32f15b80e9daffcceaad318c: Status 404 returned error can't find the container with id e3de8ee5fca8c9a660ae7ce53f7f2ebcbb7493cb32f15b80e9daffcceaad318c Mar 18 15:59:06 crc kubenswrapper[4696]: I0318 15:59:06.753461 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerStarted","Data":"2e4f5068ec13d822dec65e32bc1da8cb76fd1b106ab7d66cb2ef4152de3ca9ad"} Mar 18 15:59:06 crc kubenswrapper[4696]: I0318 15:59:06.754206 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerStarted","Data":"e3de8ee5fca8c9a660ae7ce53f7f2ebcbb7493cb32f15b80e9daffcceaad318c"} Mar 18 15:59:07 crc kubenswrapper[4696]: I0318 15:59:07.765353 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerStarted","Data":"ac66f19e992c707a33ebac8974904793255c389ce738f37dc2a1185df221d568"} Mar 18 15:59:08 crc kubenswrapper[4696]: I0318 15:59:08.779622 4696 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerStarted","Data":"2659d06d12847d6552344240e8d58c32487edd7bd75913daadf21487c5b9ac4d"} Mar 18 15:59:10 crc kubenswrapper[4696]: E0318 15:59:10.088770 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:10 crc kubenswrapper[4696]: E0318 15:59:10.091208 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:10 crc kubenswrapper[4696]: E0318 15:59:10.092861 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:10 crc kubenswrapper[4696]: E0318 15:59:10.092935 4696 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:11 crc kubenswrapper[4696]: I0318 15:59:11.819585 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerStarted","Data":"d33d9fc55f27949c91f38ca64dd04a870bbfb20d76a75555d56cc2fc6de7748e"} Mar 18 15:59:11 crc kubenswrapper[4696]: I0318 15:59:11.820431 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 15:59:11 crc kubenswrapper[4696]: I0318 15:59:11.843441 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.847650666 podStartE2EDuration="7.843415104s" podCreationTimestamp="2026-03-18 15:59:04 +0000 UTC" firstStartedPulling="2026-03-18 15:59:05.79060626 +0000 UTC m=+1388.796780466" lastFinishedPulling="2026-03-18 15:59:10.786370698 +0000 UTC m=+1393.792544904" observedRunningTime="2026-03-18 15:59:11.840767058 +0000 UTC m=+1394.846941264" watchObservedRunningTime="2026-03-18 15:59:11.843415104 +0000 UTC m=+1394.849589410" Mar 18 15:59:12 crc kubenswrapper[4696]: I0318 15:59:12.184210 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:59:12 crc kubenswrapper[4696]: I0318 15:59:12.184285 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:59:15 crc kubenswrapper[4696]: E0318 15:59:15.086841 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:15 crc kubenswrapper[4696]: E0318 15:59:15.089340 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:15 crc kubenswrapper[4696]: E0318 15:59:15.090802 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:15 crc kubenswrapper[4696]: E0318 15:59:15.090839 4696 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:20 crc kubenswrapper[4696]: E0318 15:59:20.087120 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:20 crc kubenswrapper[4696]: E0318 15:59:20.089857 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:20 crc kubenswrapper[4696]: E0318 15:59:20.091902 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:20 crc kubenswrapper[4696]: E0318 15:59:20.091969 4696 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:25 crc kubenswrapper[4696]: E0318 15:59:25.086968 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:25 crc kubenswrapper[4696]: E0318 15:59:25.089337 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:25 crc kubenswrapper[4696]: E0318 15:59:25.090651 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 15:59:25 crc kubenswrapper[4696]: E0318 15:59:25.090690 4696 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.021897 4696 generic.go:334] "Generic (PLEG): container finished" podID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" exitCode=137 Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.021952 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"73d0831c-120c-43ad-8e4e-cb796c5fb554","Type":"ContainerDied","Data":"8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c"} Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.022926 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"73d0831c-120c-43ad-8e4e-cb796c5fb554","Type":"ContainerDied","Data":"d14d7fab81d4d80fe4e8e57ac7783bd5f6fbf83a01b4d74739fe7c45a80424ca"} Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.022951 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d14d7fab81d4d80fe4e8e57ac7783bd5f6fbf83a01b4d74739fe7c45a80424ca" Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.037926 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.151290 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-config-data\") pod \"73d0831c-120c-43ad-8e4e-cb796c5fb554\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.151649 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4zst\" (UniqueName: \"kubernetes.io/projected/73d0831c-120c-43ad-8e4e-cb796c5fb554-kube-api-access-g4zst\") pod \"73d0831c-120c-43ad-8e4e-cb796c5fb554\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.151683 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-combined-ca-bundle\") pod \"73d0831c-120c-43ad-8e4e-cb796c5fb554\" (UID: \"73d0831c-120c-43ad-8e4e-cb796c5fb554\") " Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.160609 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d0831c-120c-43ad-8e4e-cb796c5fb554-kube-api-access-g4zst" (OuterVolumeSpecName: "kube-api-access-g4zst") pod "73d0831c-120c-43ad-8e4e-cb796c5fb554" (UID: "73d0831c-120c-43ad-8e4e-cb796c5fb554"). InnerVolumeSpecName "kube-api-access-g4zst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.186318 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-config-data" (OuterVolumeSpecName: "config-data") pod "73d0831c-120c-43ad-8e4e-cb796c5fb554" (UID: "73d0831c-120c-43ad-8e4e-cb796c5fb554"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.196380 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73d0831c-120c-43ad-8e4e-cb796c5fb554" (UID: "73d0831c-120c-43ad-8e4e-cb796c5fb554"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.254968 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.255012 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4zst\" (UniqueName: \"kubernetes.io/projected/73d0831c-120c-43ad-8e4e-cb796c5fb554-kube-api-access-g4zst\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:28 crc kubenswrapper[4696]: I0318 15:59:28.255025 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d0831c-120c-43ad-8e4e-cb796c5fb554-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.035747 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.097391 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mvr4r"] Mar 18 15:59:29 crc kubenswrapper[4696]: E0318 15:59:29.098810 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.098840 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.099098 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" containerName="nova-cell0-conductor-conductor" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.102034 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.107130 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.119140 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvr4r"] Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.133675 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.174747 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-utilities\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc 
kubenswrapper[4696]: I0318 15:59:29.174820 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-catalog-content\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.175006 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvgs\" (UniqueName: \"kubernetes.io/projected/0644b4b4-173e-45a4-a79d-0127945fbf38-kube-api-access-4gvgs\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.183736 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.185493 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.188479 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.188590 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gxncm" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.201883 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.277872 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvgs\" (UniqueName: \"kubernetes.io/projected/0644b4b4-173e-45a4-a79d-0127945fbf38-kube-api-access-4gvgs\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.277992 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghv54\" (UniqueName: \"kubernetes.io/projected/f6247665-ab0d-4101-acfd-c3da0f598788-kube-api-access-ghv54\") pod \"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.278060 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-utilities\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.278102 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6247665-ab0d-4101-acfd-c3da0f598788-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.278129 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-catalog-content\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.278184 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6247665-ab0d-4101-acfd-c3da0f598788-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.278883 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-utilities\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.278948 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-catalog-content\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.300878 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvgs\" (UniqueName: 
\"kubernetes.io/projected/0644b4b4-173e-45a4-a79d-0127945fbf38-kube-api-access-4gvgs\") pod \"redhat-marketplace-mvr4r\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.380751 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6247665-ab0d-4101-acfd-c3da0f598788-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.380892 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6247665-ab0d-4101-acfd-c3da0f598788-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.380987 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghv54\" (UniqueName: \"kubernetes.io/projected/f6247665-ab0d-4101-acfd-c3da0f598788-kube-api-access-ghv54\") pod \"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.386506 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6247665-ab0d-4101-acfd-c3da0f598788-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.387686 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6247665-ab0d-4101-acfd-c3da0f598788-config-data\") pod 
\"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.404293 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghv54\" (UniqueName: \"kubernetes.io/projected/f6247665-ab0d-4101-acfd-c3da0f598788-kube-api-access-ghv54\") pod \"nova-cell0-conductor-0\" (UID: \"f6247665-ab0d-4101-acfd-c3da0f598788\") " pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.430833 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.512483 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.614712 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d0831c-120c-43ad-8e4e-cb796c5fb554" path="/var/lib/kubelet/pods/73d0831c-120c-43ad-8e4e-cb796c5fb554/volumes" Mar 18 15:59:29 crc kubenswrapper[4696]: I0318 15:59:29.968713 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvr4r"] Mar 18 15:59:30 crc kubenswrapper[4696]: I0318 15:59:30.055590 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvr4r" event={"ID":"0644b4b4-173e-45a4-a79d-0127945fbf38","Type":"ContainerStarted","Data":"9de7b7dda8cf05c97dbf56640f13d62e3a77315df2411a3ad435fada7d819b4c"} Mar 18 15:59:30 crc kubenswrapper[4696]: I0318 15:59:30.073493 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 15:59:30 crc kubenswrapper[4696]: W0318 15:59:30.075990 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6247665_ab0d_4101_acfd_c3da0f598788.slice/crio-e0a095d29520c5a12416acf72f7c10998fbf0fe085f5643cb597644895e7ab37 WatchSource:0}: Error finding container e0a095d29520c5a12416acf72f7c10998fbf0fe085f5643cb597644895e7ab37: Status 404 returned error can't find the container with id e0a095d29520c5a12416acf72f7c10998fbf0fe085f5643cb597644895e7ab37 Mar 18 15:59:31 crc kubenswrapper[4696]: I0318 15:59:31.070461 4696 generic.go:334] "Generic (PLEG): container finished" podID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerID="256925df785d959e23310b6e6d935c4da16217ccd1a20a9880748c4766835562" exitCode=0 Mar 18 15:59:31 crc kubenswrapper[4696]: I0318 15:59:31.070583 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvr4r" event={"ID":"0644b4b4-173e-45a4-a79d-0127945fbf38","Type":"ContainerDied","Data":"256925df785d959e23310b6e6d935c4da16217ccd1a20a9880748c4766835562"} Mar 18 15:59:31 crc kubenswrapper[4696]: I0318 15:59:31.075060 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f6247665-ab0d-4101-acfd-c3da0f598788","Type":"ContainerStarted","Data":"24fc2bc82d07558136bc5f79179d662e4f91062b987f6f0da4780589dbe56f01"} Mar 18 15:59:31 crc kubenswrapper[4696]: I0318 15:59:31.075128 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f6247665-ab0d-4101-acfd-c3da0f598788","Type":"ContainerStarted","Data":"e0a095d29520c5a12416acf72f7c10998fbf0fe085f5643cb597644895e7ab37"} Mar 18 15:59:31 crc kubenswrapper[4696]: I0318 15:59:31.075270 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:31 crc kubenswrapper[4696]: I0318 15:59:31.075747 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:59:31 crc kubenswrapper[4696]: I0318 
15:59:31.135213 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.135184674 podStartE2EDuration="2.135184674s" podCreationTimestamp="2026-03-18 15:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:31.124108196 +0000 UTC m=+1414.130282412" watchObservedRunningTime="2026-03-18 15:59:31.135184674 +0000 UTC m=+1414.141358880" Mar 18 15:59:35 crc kubenswrapper[4696]: I0318 15:59:35.140734 4696 generic.go:334] "Generic (PLEG): container finished" podID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerID="8b17630054edf3c80983c9cc1402c8f9da1b3708ac7c143f27613abe076abc76" exitCode=0 Mar 18 15:59:35 crc kubenswrapper[4696]: I0318 15:59:35.140805 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvr4r" event={"ID":"0644b4b4-173e-45a4-a79d-0127945fbf38","Type":"ContainerDied","Data":"8b17630054edf3c80983c9cc1402c8f9da1b3708ac7c143f27613abe076abc76"} Mar 18 15:59:35 crc kubenswrapper[4696]: I0318 15:59:35.145199 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 15:59:38 crc kubenswrapper[4696]: I0318 15:59:38.985235 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:59:38 crc kubenswrapper[4696]: I0318 15:59:38.986088 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="40ba5312-af64-42b5-8f85-cd32aa1dd530" containerName="kube-state-metrics" containerID="cri-o://6555ec005e32e3316a2e560ccc156e6862b4dff3d46764f1c5215afb69a2b996" gracePeriod=30 Mar 18 15:59:39 crc kubenswrapper[4696]: I0318 15:59:39.540975 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 
15:59:40.208560 4696 generic.go:334] "Generic (PLEG): container finished" podID="40ba5312-af64-42b5-8f85-cd32aa1dd530" containerID="6555ec005e32e3316a2e560ccc156e6862b4dff3d46764f1c5215afb69a2b996" exitCode=2 Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.208741 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"40ba5312-af64-42b5-8f85-cd32aa1dd530","Type":"ContainerDied","Data":"6555ec005e32e3316a2e560ccc156e6862b4dff3d46764f1c5215afb69a2b996"} Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.372062 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-m56xp"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.374342 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.379059 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.379244 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.395106 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m56xp"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.472797 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-scripts\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.472892 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.472968 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l64qw\" (UniqueName: \"kubernetes.io/projected/29e3998f-8e86-422d-bfdc-093018e4e311-kube-api-access-l64qw\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.473043 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-config-data\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.575687 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l64qw\" (UniqueName: \"kubernetes.io/projected/29e3998f-8e86-422d-bfdc-093018e4e311-kube-api-access-l64qw\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.575819 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-config-data\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.575902 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-scripts\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.575963 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.584504 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.586318 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-scripts\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.586832 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-config-data\") pod \"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.601585 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l64qw\" (UniqueName: \"kubernetes.io/projected/29e3998f-8e86-422d-bfdc-093018e4e311-kube-api-access-l64qw\") pod 
\"nova-cell0-cell-mapping-m56xp\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.699747 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.711640 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.712215 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.726992 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.737090 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.788598 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxv4h\" (UniqueName: \"kubernetes.io/projected/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-kube-api-access-mxv4h\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.788670 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.788950 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-config-data\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.842626 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.845155 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.850006 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.870963 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.874046 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.888814 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.889812 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.893252 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99cd5\" (UniqueName: \"kubernetes.io/projected/79499b4b-4312-4031-913b-4aef8bc0a47b-kube-api-access-99cd5\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.893288 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79499b4b-4312-4031-913b-4aef8bc0a47b-logs\") pod \"nova-metadata-0\" (UID: 
\"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.893366 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-config-data\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.893469 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-config-data\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.893547 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.893748 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxv4h\" (UniqueName: \"kubernetes.io/projected/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-kube-api-access-mxv4h\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.895413 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 
15:59:40.899301 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-config-data\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.906379 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.919159 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.946562 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxv4h\" (UniqueName: \"kubernetes.io/projected/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-kube-api-access-mxv4h\") pod \"nova-scheduler-0\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " pod="openstack/nova-scheduler-0" Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.954746 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btct4"] Mar 18 15:59:40 crc kubenswrapper[4696]: I0318 15:59:40.970735 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.017744 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-config-data\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.017864 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drsrd\" (UniqueName: \"kubernetes.io/projected/04c5761b-8f82-46b1-903c-49c1145516a0-kube-api-access-drsrd\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.017910 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018076 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99cd5\" (UniqueName: \"kubernetes.io/projected/79499b4b-4312-4031-913b-4aef8bc0a47b-kube-api-access-99cd5\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018105 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79499b4b-4312-4031-913b-4aef8bc0a47b-logs\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 
15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018221 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-config-data\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018282 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-config\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018434 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018471 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e93819a-4d0e-4261-a89c-333f1558c5e6-logs\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018549 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018597 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-svc\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018619 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgp2d\" (UniqueName: \"kubernetes.io/projected/8e93819a-4d0e-4261-a89c-333f1558c5e6-kube-api-access-jgp2d\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018668 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.018745 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.019755 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79499b4b-4312-4031-913b-4aef8bc0a47b-logs\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.043858 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.060307 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99cd5\" (UniqueName: \"kubernetes.io/projected/79499b4b-4312-4031-913b-4aef8bc0a47b-kube-api-access-99cd5\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.081458 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-config-data\") pod \"nova-metadata-0\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " pod="openstack/nova-metadata-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.083391 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.105594 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btct4"] Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.125978 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-svc\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.126046 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgp2d\" (UniqueName: \"kubernetes.io/projected/8e93819a-4d0e-4261-a89c-333f1558c5e6-kube-api-access-jgp2d\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.126221 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.126313 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.126419 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-config-data\") pod 
\"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.128853 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drsrd\" (UniqueName: \"kubernetes.io/projected/04c5761b-8f82-46b1-903c-49c1145516a0-kube-api-access-drsrd\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.128903 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.130012 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-config\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.130245 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.130308 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e93819a-4d0e-4261-a89c-333f1558c5e6-logs\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 
15:59:41.131280 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e93819a-4d0e-4261-a89c-333f1558c5e6-logs\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.131571 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.133102 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-config\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.133727 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.135739 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.136044 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-svc\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.138703 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.153462 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgp2d\" (UniqueName: \"kubernetes.io/projected/8e93819a-4d0e-4261-a89c-333f1558c5e6-kube-api-access-jgp2d\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.160179 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-config-data\") pod \"nova-api-0\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.167719 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.168227 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drsrd\" (UniqueName: \"kubernetes.io/projected/04c5761b-8f82-46b1-903c-49c1145516a0-kube-api-access-drsrd\") pod \"dnsmasq-dns-bccf8f775-btct4\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.171067 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.174849 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.182546 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.198826 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.232570 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.232758 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.232785 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-769jc\" (UniqueName: \"kubernetes.io/projected/008931c6-5d36-4eea-954c-0f583bde3955-kube-api-access-769jc\") pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.303895 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.319092 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.335408 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.335485 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-769jc\" (UniqueName: \"kubernetes.io/projected/008931c6-5d36-4eea-954c-0f583bde3955-kube-api-access-769jc\") pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.335614 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.339889 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.341002 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-config-data\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.360043 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-769jc\" (UniqueName: \"kubernetes.io/projected/008931c6-5d36-4eea-954c-0f583bde3955-kube-api-access-769jc\") pod \"nova-cell1-novncproxy-0\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.502019 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.591281 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.650581 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n96vg\" (UniqueName: \"kubernetes.io/projected/40ba5312-af64-42b5-8f85-cd32aa1dd530-kube-api-access-n96vg\") pod \"40ba5312-af64-42b5-8f85-cd32aa1dd530\" (UID: \"40ba5312-af64-42b5-8f85-cd32aa1dd530\") " Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.670660 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ba5312-af64-42b5-8f85-cd32aa1dd530-kube-api-access-n96vg" (OuterVolumeSpecName: "kube-api-access-n96vg") pod "40ba5312-af64-42b5-8f85-cd32aa1dd530" (UID: "40ba5312-af64-42b5-8f85-cd32aa1dd530"). InnerVolumeSpecName "kube-api-access-n96vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:41 crc kubenswrapper[4696]: I0318 15:59:41.754182 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n96vg\" (UniqueName: \"kubernetes.io/projected/40ba5312-af64-42b5-8f85-cd32aa1dd530-kube-api-access-n96vg\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.103186 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2cpbx"] Mar 18 15:59:42 crc kubenswrapper[4696]: E0318 15:59:42.104063 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ba5312-af64-42b5-8f85-cd32aa1dd530" containerName="kube-state-metrics" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.104205 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ba5312-af64-42b5-8f85-cd32aa1dd530" containerName="kube-state-metrics" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.104580 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ba5312-af64-42b5-8f85-cd32aa1dd530" containerName="kube-state-metrics" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.105793 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.108306 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.109341 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.118927 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2cpbx"] Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.162054 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-config-data\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.162459 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxv5g\" (UniqueName: \"kubernetes.io/projected/8c1a18ae-c261-421e-907b-e3bb372199a2-kube-api-access-gxv5g\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.162510 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-scripts\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.162605 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.184648 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.184726 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.184792 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.185859 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7d6aeeccf3f0ce1fb7410ba06c6597ef8535cab4338e38adaf8fc42a5797086"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.185931 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" 
containerID="cri-o://e7d6aeeccf3f0ce1fb7410ba06c6597ef8535cab4338e38adaf8fc42a5797086" gracePeriod=600 Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.188147 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.190692 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="ceilometer-central-agent" containerID="cri-o://2e4f5068ec13d822dec65e32bc1da8cb76fd1b106ab7d66cb2ef4152de3ca9ad" gracePeriod=30 Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.190996 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="proxy-httpd" containerID="cri-o://d33d9fc55f27949c91f38ca64dd04a870bbfb20d76a75555d56cc2fc6de7748e" gracePeriod=30 Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.191061 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="sg-core" containerID="cri-o://2659d06d12847d6552344240e8d58c32487edd7bd75913daadf21487c5b9ac4d" gracePeriod=30 Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.191124 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="ceilometer-notification-agent" containerID="cri-o://ac66f19e992c707a33ebac8974904793255c389ce738f37dc2a1185df221d568" gracePeriod=30 Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.265060 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: 
\"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.265199 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-config-data\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.265264 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxv5g\" (UniqueName: \"kubernetes.io/projected/8c1a18ae-c261-421e-907b-e3bb372199a2-kube-api-access-gxv5g\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.265337 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-scripts\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.271115 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-scripts\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.290722 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-config-data\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " 
pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.290979 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.298051 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"40ba5312-af64-42b5-8f85-cd32aa1dd530","Type":"ContainerDied","Data":"33bc8c6a97d220f2ed2f335b0f03769321e0139e0f9e8d0b5cb7a6ff1c8759c5"} Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.298220 4696 scope.go:117] "RemoveContainer" containerID="6555ec005e32e3316a2e560ccc156e6862b4dff3d46764f1c5215afb69a2b996" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.298557 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.302019 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxv5g\" (UniqueName: \"kubernetes.io/projected/8c1a18ae-c261-421e-907b-e3bb372199a2-kube-api-access-gxv5g\") pod \"nova-cell1-conductor-db-sync-2cpbx\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.428365 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.430309 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.438936 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.449215 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.451156 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.455983 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.455983 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.459370 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.581372 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.581613 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " 
pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.581833 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9g96\" (UniqueName: \"kubernetes.io/projected/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-api-access-g9g96\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.582027 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.684169 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.684615 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.684730 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9g96\" (UniqueName: \"kubernetes.io/projected/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-api-access-g9g96\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " 
pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.684836 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.697825 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.699362 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.707531 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc kubenswrapper[4696]: I0318 15:59:42.725539 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9g96\" (UniqueName: \"kubernetes.io/projected/6b790582-ecd0-41b7-8f9c-f0ef9d2415db-kube-api-access-g9g96\") pod \"kube-state-metrics-0\" (UID: \"6b790582-ecd0-41b7-8f9c-f0ef9d2415db\") " pod="openstack/kube-state-metrics-0" Mar 18 15:59:42 crc 
kubenswrapper[4696]: I0318 15:59:42.861502 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.047243 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.319370 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"008931c6-5d36-4eea-954c-0f583bde3955","Type":"ContainerStarted","Data":"f4495eb25b493f41d161d31e08dab5613dc262476a48f8a68e2eda0d239f9c46"} Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.327005 4696 generic.go:334] "Generic (PLEG): container finished" podID="580fa059-673f-4493-a9b7-25f279ab30fe" containerID="d33d9fc55f27949c91f38ca64dd04a870bbfb20d76a75555d56cc2fc6de7748e" exitCode=0 Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.327369 4696 generic.go:334] "Generic (PLEG): container finished" podID="580fa059-673f-4493-a9b7-25f279ab30fe" containerID="2659d06d12847d6552344240e8d58c32487edd7bd75913daadf21487c5b9ac4d" exitCode=2 Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.327378 4696 generic.go:334] "Generic (PLEG): container finished" podID="580fa059-673f-4493-a9b7-25f279ab30fe" containerID="2e4f5068ec13d822dec65e32bc1da8cb76fd1b106ab7d66cb2ef4152de3ca9ad" exitCode=0 Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.327087 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerDied","Data":"d33d9fc55f27949c91f38ca64dd04a870bbfb20d76a75555d56cc2fc6de7748e"} Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.327491 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerDied","Data":"2659d06d12847d6552344240e8d58c32487edd7bd75913daadf21487c5b9ac4d"} Mar 18 
15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.327513 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerDied","Data":"2e4f5068ec13d822dec65e32bc1da8cb76fd1b106ab7d66cb2ef4152de3ca9ad"} Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.331816 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvr4r" event={"ID":"0644b4b4-173e-45a4-a79d-0127945fbf38","Type":"ContainerStarted","Data":"03d3b73ff93f3d6e01e9d320cce255d3157293f9310b8e582cb8fe903aa921f4"} Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.338098 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="e7d6aeeccf3f0ce1fb7410ba06c6597ef8535cab4338e38adaf8fc42a5797086" exitCode=0 Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.338171 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"e7d6aeeccf3f0ce1fb7410ba06c6597ef8535cab4338e38adaf8fc42a5797086"} Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.338210 4696 scope.go:117] "RemoveContainer" containerID="859c999f0d34d60bd36ffb5138cc29b3e68983a12c55849818cd9411add0b7fd" Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.355417 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mvr4r" podStartSLOduration=3.272816829 podStartE2EDuration="14.355395864s" podCreationTimestamp="2026-03-18 15:59:29 +0000 UTC" firstStartedPulling="2026-03-18 15:59:31.075215261 +0000 UTC m=+1414.081389497" lastFinishedPulling="2026-03-18 15:59:42.157794326 +0000 UTC m=+1425.163968532" observedRunningTime="2026-03-18 15:59:43.355276121 +0000 UTC m=+1426.361450327" watchObservedRunningTime="2026-03-18 15:59:43.355395864 +0000 UTC 
m=+1426.361570060" Mar 18 15:59:43 crc kubenswrapper[4696]: W0318 15:59:43.501280 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e93819a_4d0e_4261_a89c_333f1558c5e6.slice/crio-5b586bdb2d59f485e87a15c417255d50739c0b8bd804fe96ae89f4b4f35fd051 WatchSource:0}: Error finding container 5b586bdb2d59f485e87a15c417255d50739c0b8bd804fe96ae89f4b4f35fd051: Status 404 returned error can't find the container with id 5b586bdb2d59f485e87a15c417255d50739c0b8bd804fe96ae89f4b4f35fd051 Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.518397 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.615830 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ba5312-af64-42b5-8f85-cd32aa1dd530" path="/var/lib/kubelet/pods/40ba5312-af64-42b5-8f85-cd32aa1dd530/volumes" Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.617016 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btct4"] Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.617158 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.631413 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2cpbx"] Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.646622 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m56xp"] Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.726100 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 15:59:43 crc kubenswrapper[4696]: W0318 15:59:43.742835 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04c5761b_8f82_46b1_903c_49c1145516a0.slice/crio-2680be57c4c9f71ff8799d3427d714811eb46af4ed6889b1254f81e35c6d123e WatchSource:0}: Error finding container 2680be57c4c9f71ff8799d3427d714811eb46af4ed6889b1254f81e35c6d123e: Status 404 returned error can't find the container with id 2680be57c4c9f71ff8799d3427d714811eb46af4ed6889b1254f81e35c6d123e Mar 18 15:59:43 crc kubenswrapper[4696]: W0318 15:59:43.757709 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b790582_ecd0_41b7_8f9c_f0ef9d2415db.slice/crio-04b938aed6705703cc692f47db913b3ee2d2bef23f8533ff780bdae5c786a97a WatchSource:0}: Error finding container 04b938aed6705703cc692f47db913b3ee2d2bef23f8533ff780bdae5c786a97a: Status 404 returned error can't find the container with id 04b938aed6705703cc692f47db913b3ee2d2bef23f8533ff780bdae5c786a97a Mar 18 15:59:43 crc kubenswrapper[4696]: I0318 15:59:43.760570 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.354458 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" event={"ID":"8c1a18ae-c261-421e-907b-e3bb372199a2","Type":"ContainerStarted","Data":"07a759efca230d60b28d43e4356e6c2bde4ee5c4ec19228fcf757e698af1980a"} Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.355857 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79499b4b-4312-4031-913b-4aef8bc0a47b","Type":"ContainerStarted","Data":"5e5511ba3762d84bb4c3930c704bb7f394841b1b5ec4431c38f5e5f3067d883a"} Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.357049 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"6b790582-ecd0-41b7-8f9c-f0ef9d2415db","Type":"ContainerStarted","Data":"04b938aed6705703cc692f47db913b3ee2d2bef23f8533ff780bdae5c786a97a"} Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.359387 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2","Type":"ContainerStarted","Data":"dbc916981183c7ed95eb4d8cfd455cfdb4ea17aae532e1d9a3a157d3f8a05ab5"} Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.360732 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btct4" event={"ID":"04c5761b-8f82-46b1-903c-49c1145516a0","Type":"ContainerStarted","Data":"2680be57c4c9f71ff8799d3427d714811eb46af4ed6889b1254f81e35c6d123e"} Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.361983 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e93819a-4d0e-4261-a89c-333f1558c5e6","Type":"ContainerStarted","Data":"5b586bdb2d59f485e87a15c417255d50739c0b8bd804fe96ae89f4b4f35fd051"} Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.363062 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m56xp" event={"ID":"29e3998f-8e86-422d-bfdc-093018e4e311","Type":"ContainerStarted","Data":"13d69e151a6af5a010c98428ee5d93f118223f32f844dabeec52fa42d63c2f5b"} Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.367414 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9"} Mar 18 15:59:44 crc kubenswrapper[4696]: I0318 15:59:44.970741 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.001139 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.397892 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" event={"ID":"8c1a18ae-c261-421e-907b-e3bb372199a2","Type":"ContainerStarted","Data":"5b6295e623d9882ee5c1a781ea10e6bca407d26b413bc2def8ab58f02cc88f98"} Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.413846 4696 generic.go:334] "Generic (PLEG): container finished" podID="580fa059-673f-4493-a9b7-25f279ab30fe" containerID="ac66f19e992c707a33ebac8974904793255c389ce738f37dc2a1185df221d568" exitCode=0 Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.413940 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerDied","Data":"ac66f19e992c707a33ebac8974904793255c389ce738f37dc2a1185df221d568"} Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.422030 4696 generic.go:334] "Generic (PLEG): container finished" podID="04c5761b-8f82-46b1-903c-49c1145516a0" containerID="965d06eab2335d06b80091056473ffd23c9252c3b119f1dd46631d4cbce43b45" exitCode=0 Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.422143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btct4" event={"ID":"04c5761b-8f82-46b1-903c-49c1145516a0","Type":"ContainerDied","Data":"965d06eab2335d06b80091056473ffd23c9252c3b119f1dd46631d4cbce43b45"} Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.434202 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m56xp" event={"ID":"29e3998f-8e86-422d-bfdc-093018e4e311","Type":"ContainerStarted","Data":"d92ae8dabbaaa98864ddff9e17467d0e4796962852a20add4f2473e5772709b1"} Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.445100 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" 
podStartSLOduration=3.445072335 podStartE2EDuration="3.445072335s" podCreationTimestamp="2026-03-18 15:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:45.437404313 +0000 UTC m=+1428.443578519" watchObservedRunningTime="2026-03-18 15:59:45.445072335 +0000 UTC m=+1428.451246541" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.471105 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-m56xp" podStartSLOduration=5.471078426 podStartE2EDuration="5.471078426s" podCreationTimestamp="2026-03-18 15:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:45.470457201 +0000 UTC m=+1428.476631397" watchObservedRunningTime="2026-03-18 15:59:45.471078426 +0000 UTC m=+1428.477252632" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.750280 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.810964 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-config-data\") pod \"580fa059-673f-4493-a9b7-25f279ab30fe\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.811157 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-run-httpd\") pod \"580fa059-673f-4493-a9b7-25f279ab30fe\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.811294 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2d5j\" (UniqueName: \"kubernetes.io/projected/580fa059-673f-4493-a9b7-25f279ab30fe-kube-api-access-s2d5j\") pod \"580fa059-673f-4493-a9b7-25f279ab30fe\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.811318 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-log-httpd\") pod \"580fa059-673f-4493-a9b7-25f279ab30fe\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.811367 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-sg-core-conf-yaml\") pod \"580fa059-673f-4493-a9b7-25f279ab30fe\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.811403 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-combined-ca-bundle\") pod \"580fa059-673f-4493-a9b7-25f279ab30fe\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.811477 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-scripts\") pod \"580fa059-673f-4493-a9b7-25f279ab30fe\" (UID: \"580fa059-673f-4493-a9b7-25f279ab30fe\") " Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.813872 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "580fa059-673f-4493-a9b7-25f279ab30fe" (UID: "580fa059-673f-4493-a9b7-25f279ab30fe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.815245 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "580fa059-673f-4493-a9b7-25f279ab30fe" (UID: "580fa059-673f-4493-a9b7-25f279ab30fe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.832282 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580fa059-673f-4493-a9b7-25f279ab30fe-kube-api-access-s2d5j" (OuterVolumeSpecName: "kube-api-access-s2d5j") pod "580fa059-673f-4493-a9b7-25f279ab30fe" (UID: "580fa059-673f-4493-a9b7-25f279ab30fe"). InnerVolumeSpecName "kube-api-access-s2d5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.847785 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-scripts" (OuterVolumeSpecName: "scripts") pod "580fa059-673f-4493-a9b7-25f279ab30fe" (UID: "580fa059-673f-4493-a9b7-25f279ab30fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.914511 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2d5j\" (UniqueName: \"kubernetes.io/projected/580fa059-673f-4493-a9b7-25f279ab30fe-kube-api-access-s2d5j\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.914571 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.914583 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.914592 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/580fa059-673f-4493-a9b7-25f279ab30fe-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.923767 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "580fa059-673f-4493-a9b7-25f279ab30fe" (UID: "580fa059-673f-4493-a9b7-25f279ab30fe"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:45 crc kubenswrapper[4696]: I0318 15:59:45.982447 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "580fa059-673f-4493-a9b7-25f279ab30fe" (UID: "580fa059-673f-4493-a9b7-25f279ab30fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.018246 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.018282 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.064031 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-config-data" (OuterVolumeSpecName: "config-data") pod "580fa059-673f-4493-a9b7-25f279ab30fe" (UID: "580fa059-673f-4493-a9b7-25f279ab30fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.122933 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580fa059-673f-4493-a9b7-25f279ab30fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.450967 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"580fa059-673f-4493-a9b7-25f279ab30fe","Type":"ContainerDied","Data":"e3de8ee5fca8c9a660ae7ce53f7f2ebcbb7493cb32f15b80e9daffcceaad318c"} Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.451045 4696 scope.go:117] "RemoveContainer" containerID="d33d9fc55f27949c91f38ca64dd04a870bbfb20d76a75555d56cc2fc6de7748e" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.451254 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.470271 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btct4" event={"ID":"04c5761b-8f82-46b1-903c-49c1145516a0","Type":"ContainerStarted","Data":"a31eb740c98c11bd1dbbe21b974e9398871d2793161b81fef1ff3d52d6c553f9"} Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.476311 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.508608 4696 scope.go:117] "RemoveContainer" containerID="2659d06d12847d6552344240e8d58c32487edd7bd75913daadf21487c5b9ac4d" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.524473 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-btct4" podStartSLOduration=6.52444272 podStartE2EDuration="6.52444272s" podCreationTimestamp="2026-03-18 15:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:46.511388692 +0000 UTC m=+1429.517562908" watchObservedRunningTime="2026-03-18 15:59:46.52444272 +0000 UTC m=+1429.530616936" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.547635 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.561478 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.598066 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:46 crc kubenswrapper[4696]: E0318 15:59:46.598650 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="sg-core" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.598664 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="sg-core" Mar 18 15:59:46 crc kubenswrapper[4696]: E0318 15:59:46.598682 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="proxy-httpd" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.598688 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="proxy-httpd" Mar 18 15:59:46 crc kubenswrapper[4696]: E0318 15:59:46.598716 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="ceilometer-central-agent" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.598722 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="ceilometer-central-agent" Mar 18 15:59:46 crc kubenswrapper[4696]: E0318 15:59:46.598728 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="ceilometer-notification-agent" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.598738 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="ceilometer-notification-agent" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.604008 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="proxy-httpd" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.604051 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="ceilometer-central-agent" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.604063 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="ceilometer-notification-agent" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.604103 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" containerName="sg-core" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.606384 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.614329 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.614391 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.614329 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.624671 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.740633 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-log-httpd\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.740699 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kq6p\" (UniqueName: \"kubernetes.io/projected/bdc7971b-ce0d-490b-b046-7be30738505a-kube-api-access-9kq6p\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.740755 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.740821 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-scripts\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.740838 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.740894 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-run-httpd\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.740941 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.740961 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-config-data\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.842728 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-scripts\") pod \"ceilometer-0\" (UID: 
\"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.842772 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.842841 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-run-httpd\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.842892 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.842911 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-config-data\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.843061 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-log-httpd\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.843126 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9kq6p\" (UniqueName: \"kubernetes.io/projected/bdc7971b-ce0d-490b-b046-7be30738505a-kube-api-access-9kq6p\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.843182 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.843567 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-run-httpd\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.843595 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-log-httpd\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.849419 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.851855 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-scripts\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: 
I0318 15:59:46.851911 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.852234 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.856333 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-config-data\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:46 crc kubenswrapper[4696]: I0318 15:59:46.867411 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kq6p\" (UniqueName: \"kubernetes.io/projected/bdc7971b-ce0d-490b-b046-7be30738505a-kube-api-access-9kq6p\") pod \"ceilometer-0\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " pod="openstack/ceilometer-0" Mar 18 15:59:47 crc kubenswrapper[4696]: I0318 15:59:47.004016 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:47 crc kubenswrapper[4696]: I0318 15:59:47.263177 4696 scope.go:117] "RemoveContainer" containerID="ac66f19e992c707a33ebac8974904793255c389ce738f37dc2a1185df221d568" Mar 18 15:59:47 crc kubenswrapper[4696]: I0318 15:59:47.626618 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580fa059-673f-4493-a9b7-25f279ab30fe" path="/var/lib/kubelet/pods/580fa059-673f-4493-a9b7-25f279ab30fe/volumes" Mar 18 15:59:47 crc kubenswrapper[4696]: I0318 15:59:47.732795 4696 scope.go:117] "RemoveContainer" containerID="2e4f5068ec13d822dec65e32bc1da8cb76fd1b106ab7d66cb2ef4152de3ca9ad" Mar 18 15:59:48 crc kubenswrapper[4696]: I0318 15:59:48.197920 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:48 crc kubenswrapper[4696]: W0318 15:59:48.531852 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdc7971b_ce0d_490b_b046_7be30738505a.slice/crio-a070ad6c78e95935a9bb4b277522569ca30cfb54f99509b53bae1c4e00a533fa WatchSource:0}: Error finding container a070ad6c78e95935a9bb4b277522569ca30cfb54f99509b53bae1c4e00a533fa: Status 404 returned error can't find the container with id a070ad6c78e95935a9bb4b277522569ca30cfb54f99509b53bae1c4e00a533fa Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.431636 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.433590 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.506243 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.520018 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b790582-ecd0-41b7-8f9c-f0ef9d2415db","Type":"ContainerStarted","Data":"863ced85c284eb5952ba75b4d8fb680f0f814ec97c99fccf31501cc42820cf71"} Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.520151 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.528219 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerStarted","Data":"a070ad6c78e95935a9bb4b277522569ca30cfb54f99509b53bae1c4e00a533fa"} Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.569480 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=4.070838277 podStartE2EDuration="7.569451792s" podCreationTimestamp="2026-03-18 15:59:42 +0000 UTC" firstStartedPulling="2026-03-18 15:59:43.765293524 +0000 UTC m=+1426.771467730" lastFinishedPulling="2026-03-18 15:59:47.263907039 +0000 UTC m=+1430.270081245" observedRunningTime="2026-03-18 15:59:49.561227415 +0000 UTC m=+1432.567401641" watchObservedRunningTime="2026-03-18 15:59:49.569451792 +0000 UTC m=+1432.575625998" Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.612273 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:49 crc kubenswrapper[4696]: I0318 15:59:49.758679 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvr4r"] Mar 18 15:59:51 crc kubenswrapper[4696]: I0318 15:59:51.321772 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 15:59:51 crc kubenswrapper[4696]: I0318 15:59:51.396152 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6578955fd5-b2tn8"] Mar 18 15:59:51 crc kubenswrapper[4696]: I0318 15:59:51.396806 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" podUID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" containerName="dnsmasq-dns" containerID="cri-o://8b584819fe9014bb22195965686d5a101df9b02fefcd55614758091106e8ffb5" gracePeriod=10 Mar 18 15:59:51 crc kubenswrapper[4696]: I0318 15:59:51.549132 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mvr4r" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerName="registry-server" containerID="cri-o://03d3b73ff93f3d6e01e9d320cce255d3157293f9310b8e582cb8fe903aa921f4" gracePeriod=2 Mar 18 15:59:52 crc kubenswrapper[4696]: I0318 15:59:52.671911 4696 generic.go:334] "Generic (PLEG): container finished" podID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerID="03d3b73ff93f3d6e01e9d320cce255d3157293f9310b8e582cb8fe903aa921f4" exitCode=0 Mar 18 15:59:52 crc kubenswrapper[4696]: I0318 15:59:52.672441 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvr4r" event={"ID":"0644b4b4-173e-45a4-a79d-0127945fbf38","Type":"ContainerDied","Data":"03d3b73ff93f3d6e01e9d320cce255d3157293f9310b8e582cb8fe903aa921f4"} Mar 18 15:59:52 crc kubenswrapper[4696]: I0318 15:59:52.744803 4696 generic.go:334] "Generic (PLEG): container finished" podID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" containerID="8b584819fe9014bb22195965686d5a101df9b02fefcd55614758091106e8ffb5" exitCode=0 Mar 18 15:59:52 crc kubenswrapper[4696]: I0318 15:59:52.745189 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" event={"ID":"25e269f9-a3d2-48a3-ac5b-dcdc18a31107","Type":"ContainerDied","Data":"8b584819fe9014bb22195965686d5a101df9b02fefcd55614758091106e8ffb5"} Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.357019 4696 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.470109 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gvgs\" (UniqueName: \"kubernetes.io/projected/0644b4b4-173e-45a4-a79d-0127945fbf38-kube-api-access-4gvgs\") pod \"0644b4b4-173e-45a4-a79d-0127945fbf38\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.470286 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-catalog-content\") pod \"0644b4b4-173e-45a4-a79d-0127945fbf38\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.470429 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-utilities\") pod \"0644b4b4-173e-45a4-a79d-0127945fbf38\" (UID: \"0644b4b4-173e-45a4-a79d-0127945fbf38\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.477067 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-utilities" (OuterVolumeSpecName: "utilities") pod "0644b4b4-173e-45a4-a79d-0127945fbf38" (UID: "0644b4b4-173e-45a4-a79d-0127945fbf38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.478878 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0644b4b4-173e-45a4-a79d-0127945fbf38-kube-api-access-4gvgs" (OuterVolumeSpecName: "kube-api-access-4gvgs") pod "0644b4b4-173e-45a4-a79d-0127945fbf38" (UID: "0644b4b4-173e-45a4-a79d-0127945fbf38"). 
InnerVolumeSpecName "kube-api-access-4gvgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.522386 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0644b4b4-173e-45a4-a79d-0127945fbf38" (UID: "0644b4b4-173e-45a4-a79d-0127945fbf38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.573358 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gvgs\" (UniqueName: \"kubernetes.io/projected/0644b4b4-173e-45a4-a79d-0127945fbf38-kube-api-access-4gvgs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.573412 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.573427 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0644b4b4-173e-45a4-a79d-0127945fbf38-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.682688 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.767668 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" event={"ID":"25e269f9-a3d2-48a3-ac5b-dcdc18a31107","Type":"ContainerDied","Data":"8ce0f5b33490c37fed3bdf60182c807607e378dfc1b1ea70b4a413df73a33872"} Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.767744 4696 scope.go:117] "RemoveContainer" containerID="8b584819fe9014bb22195965686d5a101df9b02fefcd55614758091106e8ffb5" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.767901 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-b2tn8" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.777877 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-sb\") pod \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.778028 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-config\") pod \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.778116 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-swift-storage-0\") pod \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.778164 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-nb\") pod \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.778197 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gqt9\" (UniqueName: \"kubernetes.io/projected/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-kube-api-access-5gqt9\") pod \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.778234 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-svc\") pod \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\" (UID: \"25e269f9-a3d2-48a3-ac5b-dcdc18a31107\") " Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.782297 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvr4r" event={"ID":"0644b4b4-173e-45a4-a79d-0127945fbf38","Type":"ContainerDied","Data":"9de7b7dda8cf05c97dbf56640f13d62e3a77315df2411a3ad435fada7d819b4c"} Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.782713 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mvr4r" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.789684 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-kube-api-access-5gqt9" (OuterVolumeSpecName: "kube-api-access-5gqt9") pod "25e269f9-a3d2-48a3-ac5b-dcdc18a31107" (UID: "25e269f9-a3d2-48a3-ac5b-dcdc18a31107"). InnerVolumeSpecName "kube-api-access-5gqt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.803609 4696 scope.go:117] "RemoveContainer" containerID="f6ee94ca50e28aff25324321004feee98995fb655e66703e42ad66981b6b8600" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.882339 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gqt9\" (UniqueName: \"kubernetes.io/projected/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-kube-api-access-5gqt9\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:53 crc kubenswrapper[4696]: I0318 15:59:53.993497 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25e269f9-a3d2-48a3-ac5b-dcdc18a31107" (UID: "25e269f9-a3d2-48a3-ac5b-dcdc18a31107"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.011148 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25e269f9-a3d2-48a3-ac5b-dcdc18a31107" (UID: "25e269f9-a3d2-48a3-ac5b-dcdc18a31107"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.020294 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-config" (OuterVolumeSpecName: "config") pod "25e269f9-a3d2-48a3-ac5b-dcdc18a31107" (UID: "25e269f9-a3d2-48a3-ac5b-dcdc18a31107"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.024004 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25e269f9-a3d2-48a3-ac5b-dcdc18a31107" (UID: "25e269f9-a3d2-48a3-ac5b-dcdc18a31107"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.028222 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25e269f9-a3d2-48a3-ac5b-dcdc18a31107" (UID: "25e269f9-a3d2-48a3-ac5b-dcdc18a31107"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.089839 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.090145 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.090247 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.090354 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:54 crc 
kubenswrapper[4696]: I0318 15:59:54.090441 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e269f9-a3d2-48a3-ac5b-dcdc18a31107-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.104782 4696 scope.go:117] "RemoveContainer" containerID="03d3b73ff93f3d6e01e9d320cce255d3157293f9310b8e582cb8fe903aa921f4" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.128716 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvr4r"] Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.140576 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvr4r"] Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.157779 4696 scope.go:117] "RemoveContainer" containerID="8b17630054edf3c80983c9cc1402c8f9da1b3708ac7c143f27613abe076abc76" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.161875 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-b2tn8"] Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.170833 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-b2tn8"] Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.190957 4696 scope.go:117] "RemoveContainer" containerID="256925df785d959e23310b6e6d935c4da16217ccd1a20a9880748c4766835562" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.794582 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79499b4b-4312-4031-913b-4aef8bc0a47b","Type":"ContainerStarted","Data":"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.796236 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"79499b4b-4312-4031-913b-4aef8bc0a47b","Type":"ContainerStarted","Data":"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.796170 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerName="nova-metadata-metadata" containerID="cri-o://8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c" gracePeriod=30 Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.795095 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerName="nova-metadata-log" containerID="cri-o://f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83" gracePeriod=30 Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.805062 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"008931c6-5d36-4eea-954c-0f583bde3955","Type":"ContainerStarted","Data":"9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.805097 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="008931c6-5d36-4eea-954c-0f583bde3955" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7" gracePeriod=30 Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.825866 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2","Type":"ContainerStarted","Data":"b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.832063 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=5.33555193 podStartE2EDuration="14.832027234s" podCreationTimestamp="2026-03-18 15:59:40 +0000 UTC" firstStartedPulling="2026-03-18 15:59:43.721086717 +0000 UTC m=+1426.727260923" lastFinishedPulling="2026-03-18 15:59:53.217562011 +0000 UTC m=+1436.223736227" observedRunningTime="2026-03-18 15:59:54.825171101 +0000 UTC m=+1437.831345317" watchObservedRunningTime="2026-03-18 15:59:54.832027234 +0000 UTC m=+1437.838201450" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.842954 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e93819a-4d0e-4261-a89c-333f1558c5e6","Type":"ContainerStarted","Data":"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.843024 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e93819a-4d0e-4261-a89c-333f1558c5e6","Type":"ContainerStarted","Data":"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.865112 4696 generic.go:334] "Generic (PLEG): container finished" podID="29e3998f-8e86-422d-bfdc-093018e4e311" containerID="d92ae8dabbaaa98864ddff9e17467d0e4796962852a20add4f2473e5772709b1" exitCode=0 Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.866327 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m56xp" event={"ID":"29e3998f-8e86-422d-bfdc-093018e4e311","Type":"ContainerDied","Data":"d92ae8dabbaaa98864ddff9e17467d0e4796962852a20add4f2473e5772709b1"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.866731 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.706019749 podStartE2EDuration="13.866710357s" podCreationTimestamp="2026-03-18 15:59:41 +0000 UTC" firstStartedPulling="2026-03-18 15:59:43.056102784 +0000 UTC 
m=+1426.062276990" lastFinishedPulling="2026-03-18 15:59:53.216793392 +0000 UTC m=+1436.222967598" observedRunningTime="2026-03-18 15:59:54.846784055 +0000 UTC m=+1437.852958281" watchObservedRunningTime="2026-03-18 15:59:54.866710357 +0000 UTC m=+1437.872884563" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.880813 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerStarted","Data":"3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.881193 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerStarted","Data":"953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53"} Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.890239 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.156120918 podStartE2EDuration="14.890208409s" podCreationTimestamp="2026-03-18 15:59:40 +0000 UTC" firstStartedPulling="2026-03-18 15:59:43.506632213 +0000 UTC m=+1426.512806419" lastFinishedPulling="2026-03-18 15:59:53.240719704 +0000 UTC m=+1436.246893910" observedRunningTime="2026-03-18 15:59:54.88033032 +0000 UTC m=+1437.886504536" watchObservedRunningTime="2026-03-18 15:59:54.890208409 +0000 UTC m=+1437.896382615" Mar 18 15:59:54 crc kubenswrapper[4696]: I0318 15:59:54.961949 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=5.478074624 podStartE2EDuration="14.961917764s" podCreationTimestamp="2026-03-18 15:59:40 +0000 UTC" firstStartedPulling="2026-03-18 15:59:43.749650322 +0000 UTC m=+1426.755824528" lastFinishedPulling="2026-03-18 15:59:53.233493452 +0000 UTC m=+1436.239667668" observedRunningTime="2026-03-18 15:59:54.922981284 +0000 UTC m=+1437.929155510" 
watchObservedRunningTime="2026-03-18 15:59:54.961917764 +0000 UTC m=+1437.968091990" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.609494 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.651015 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" path="/var/lib/kubelet/pods/0644b4b4-173e-45a4-a79d-0127945fbf38/volumes" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.653881 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" path="/var/lib/kubelet/pods/25e269f9-a3d2-48a3-ac5b-dcdc18a31107/volumes" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.747897 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99cd5\" (UniqueName: \"kubernetes.io/projected/79499b4b-4312-4031-913b-4aef8bc0a47b-kube-api-access-99cd5\") pod \"79499b4b-4312-4031-913b-4aef8bc0a47b\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.748222 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79499b4b-4312-4031-913b-4aef8bc0a47b-logs\") pod \"79499b4b-4312-4031-913b-4aef8bc0a47b\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.748297 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-config-data\") pod \"79499b4b-4312-4031-913b-4aef8bc0a47b\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.748355 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-combined-ca-bundle\") pod \"79499b4b-4312-4031-913b-4aef8bc0a47b\" (UID: \"79499b4b-4312-4031-913b-4aef8bc0a47b\") " Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.754971 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79499b4b-4312-4031-913b-4aef8bc0a47b-logs" (OuterVolumeSpecName: "logs") pod "79499b4b-4312-4031-913b-4aef8bc0a47b" (UID: "79499b4b-4312-4031-913b-4aef8bc0a47b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.757634 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79499b4b-4312-4031-913b-4aef8bc0a47b-kube-api-access-99cd5" (OuterVolumeSpecName: "kube-api-access-99cd5") pod "79499b4b-4312-4031-913b-4aef8bc0a47b" (UID: "79499b4b-4312-4031-913b-4aef8bc0a47b"). InnerVolumeSpecName "kube-api-access-99cd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.790148 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-config-data" (OuterVolumeSpecName: "config-data") pod "79499b4b-4312-4031-913b-4aef8bc0a47b" (UID: "79499b4b-4312-4031-913b-4aef8bc0a47b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.835977 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79499b4b-4312-4031-913b-4aef8bc0a47b" (UID: "79499b4b-4312-4031-913b-4aef8bc0a47b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.851139 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79499b4b-4312-4031-913b-4aef8bc0a47b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.851187 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.851203 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79499b4b-4312-4031-913b-4aef8bc0a47b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.851223 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99cd5\" (UniqueName: \"kubernetes.io/projected/79499b4b-4312-4031-913b-4aef8bc0a47b-kube-api-access-99cd5\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.903375 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerStarted","Data":"a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2"} Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.906252 4696 generic.go:334] "Generic (PLEG): container finished" podID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerID="8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c" exitCode=0 Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.906340 4696 generic.go:334] "Generic (PLEG): container finished" podID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerID="f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83" exitCode=143 Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.906716 4696 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.913098 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79499b4b-4312-4031-913b-4aef8bc0a47b","Type":"ContainerDied","Data":"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c"} Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.913184 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79499b4b-4312-4031-913b-4aef8bc0a47b","Type":"ContainerDied","Data":"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83"} Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.913200 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79499b4b-4312-4031-913b-4aef8bc0a47b","Type":"ContainerDied","Data":"5e5511ba3762d84bb4c3930c704bb7f394841b1b5ec4431c38f5e5f3067d883a"} Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.913223 4696 scope.go:117] "RemoveContainer" containerID="8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c" Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.984006 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:55 crc kubenswrapper[4696]: I0318 15:59:55.985795 4696 scope.go:117] "RemoveContainer" containerID="f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.013576 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.025660 4696 scope.go:117] "RemoveContainer" containerID="8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.032234 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 
15:59:56 crc kubenswrapper[4696]: E0318 15:59:56.033420 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" containerName="init" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033451 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" containerName="init" Mar 18 15:59:56 crc kubenswrapper[4696]: E0318 15:59:56.033478 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerName="registry-server" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033486 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerName="registry-server" Mar 18 15:59:56 crc kubenswrapper[4696]: E0318 15:59:56.033502 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerName="extract-utilities" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033510 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerName="extract-utilities" Mar 18 15:59:56 crc kubenswrapper[4696]: E0318 15:59:56.033534 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerName="extract-content" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033544 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerName="extract-content" Mar 18 15:59:56 crc kubenswrapper[4696]: E0318 15:59:56.033565 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" containerName="dnsmasq-dns" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033572 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" containerName="dnsmasq-dns" Mar 18 15:59:56 crc kubenswrapper[4696]: E0318 
15:59:56.033593 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerName="nova-metadata-log" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033600 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerName="nova-metadata-log" Mar 18 15:59:56 crc kubenswrapper[4696]: E0318 15:59:56.033616 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerName="nova-metadata-metadata" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033624 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerName="nova-metadata-metadata" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033882 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerName="nova-metadata-metadata" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033904 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e269f9-a3d2-48a3-ac5b-dcdc18a31107" containerName="dnsmasq-dns" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033913 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" containerName="nova-metadata-log" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.033923 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0644b4b4-173e-45a4-a79d-0127945fbf38" containerName="registry-server" Mar 18 15:59:56 crc kubenswrapper[4696]: E0318 15:59:56.034762 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c\": container with ID starting with 8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c not found: ID does not exist" 
containerID="8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.034846 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c"} err="failed to get container status \"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c\": rpc error: code = NotFound desc = could not find container \"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c\": container with ID starting with 8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c not found: ID does not exist" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.034896 4696 scope.go:117] "RemoveContainer" containerID="f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83" Mar 18 15:59:56 crc kubenswrapper[4696]: E0318 15:59:56.035340 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83\": container with ID starting with f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83 not found: ID does not exist" containerID="f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.035369 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83"} err="failed to get container status \"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83\": rpc error: code = NotFound desc = could not find container \"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83\": container with ID starting with f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83 not found: ID does not exist" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.035391 4696 scope.go:117] 
"RemoveContainer" containerID="8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.035874 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.038649 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c"} err="failed to get container status \"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c\": rpc error: code = NotFound desc = could not find container \"8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c\": container with ID starting with 8264fcac528ea2d14e28ce0b2b9278ab58afca379e4a0e0ee5265d8f1470ba2c not found: ID does not exist" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.038680 4696 scope.go:117] "RemoveContainer" containerID="f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.040977 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83"} err="failed to get container status \"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83\": rpc error: code = NotFound desc = could not find container \"f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83\": container with ID starting with f8bce6e6ba021e4da80fa4e3335bfcacc3ea75f15e59e78cbf42706a3bd2cf83 not found: ID does not exist" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.041437 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.043738 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 15:59:56 crc 
kubenswrapper[4696]: I0318 15:59:56.077614 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.087324 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.165355 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn726\" (UniqueName: \"kubernetes.io/projected/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-kube-api-access-vn726\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.165441 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-config-data\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.165505 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.165655 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.165721 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-logs\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.268254 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.268385 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-logs\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.268453 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn726\" (UniqueName: \"kubernetes.io/projected/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-kube-api-access-vn726\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.268511 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-config-data\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.268604 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.269229 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-logs\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.275133 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.279419 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-config-data\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.279985 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.294418 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn726\" (UniqueName: \"kubernetes.io/projected/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-kube-api-access-vn726\") pod \"nova-metadata-0\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.341858 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.407560 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.472144 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-combined-ca-bundle\") pod \"29e3998f-8e86-422d-bfdc-093018e4e311\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.472284 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-config-data\") pod \"29e3998f-8e86-422d-bfdc-093018e4e311\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.472402 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l64qw\" (UniqueName: \"kubernetes.io/projected/29e3998f-8e86-422d-bfdc-093018e4e311-kube-api-access-l64qw\") pod \"29e3998f-8e86-422d-bfdc-093018e4e311\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.472732 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-scripts\") pod \"29e3998f-8e86-422d-bfdc-093018e4e311\" (UID: \"29e3998f-8e86-422d-bfdc-093018e4e311\") " Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.482655 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e3998f-8e86-422d-bfdc-093018e4e311-kube-api-access-l64qw" (OuterVolumeSpecName: "kube-api-access-l64qw") pod "29e3998f-8e86-422d-bfdc-093018e4e311" (UID: 
"29e3998f-8e86-422d-bfdc-093018e4e311"). InnerVolumeSpecName "kube-api-access-l64qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.483066 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-scripts" (OuterVolumeSpecName: "scripts") pod "29e3998f-8e86-422d-bfdc-093018e4e311" (UID: "29e3998f-8e86-422d-bfdc-093018e4e311"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.502221 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.553382 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-config-data" (OuterVolumeSpecName: "config-data") pod "29e3998f-8e86-422d-bfdc-093018e4e311" (UID: "29e3998f-8e86-422d-bfdc-093018e4e311"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.560800 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29e3998f-8e86-422d-bfdc-093018e4e311" (UID: "29e3998f-8e86-422d-bfdc-093018e4e311"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.578077 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.578130 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.578148 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e3998f-8e86-422d-bfdc-093018e4e311-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.578160 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l64qw\" (UniqueName: \"kubernetes.io/projected/29e3998f-8e86-422d-bfdc-093018e4e311-kube-api-access-l64qw\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.926946 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m56xp" event={"ID":"29e3998f-8e86-422d-bfdc-093018e4e311","Type":"ContainerDied","Data":"13d69e151a6af5a010c98428ee5d93f118223f32f844dabeec52fa42d63c2f5b"} Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.927238 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d69e151a6af5a010c98428ee5d93f118223f32f844dabeec52fa42d63c2f5b" Mar 18 15:59:56 crc kubenswrapper[4696]: I0318 15:59:56.927329 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m56xp" Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.114926 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:57 crc kubenswrapper[4696]: W0318 15:59:57.119391 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c8a4f7_6983_4b1a_a959_25f4d04a6b30.slice/crio-b8e2c9577bcf3d364c1ddcd4e95c957dd9ee6aa23711e3c4b65bc5fbece9d7b8 WatchSource:0}: Error finding container b8e2c9577bcf3d364c1ddcd4e95c957dd9ee6aa23711e3c4b65bc5fbece9d7b8: Status 404 returned error can't find the container with id b8e2c9577bcf3d364c1ddcd4e95c957dd9ee6aa23711e3c4b65bc5fbece9d7b8 Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.146932 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.147259 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerName="nova-api-log" containerID="cri-o://f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd" gracePeriod=30 Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.147323 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerName="nova-api-api" containerID="cri-o://c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb" gracePeriod=30 Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.188142 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.225388 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.637281 4696 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="79499b4b-4312-4031-913b-4aef8bc0a47b" path="/var/lib/kubelet/pods/79499b4b-4312-4031-913b-4aef8bc0a47b/volumes" Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.889927 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.980461 4696 generic.go:334] "Generic (PLEG): container finished" podID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerID="c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb" exitCode=0 Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.980514 4696 generic.go:334] "Generic (PLEG): container finished" podID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerID="f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd" exitCode=143 Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.980730 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.980901 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e93819a-4d0e-4261-a89c-333f1558c5e6","Type":"ContainerDied","Data":"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb"} Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.980984 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e93819a-4d0e-4261-a89c-333f1558c5e6","Type":"ContainerDied","Data":"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd"} Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.980998 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e93819a-4d0e-4261-a89c-333f1558c5e6","Type":"ContainerDied","Data":"5b586bdb2d59f485e87a15c417255d50739c0b8bd804fe96ae89f4b4f35fd051"} Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.981024 4696 scope.go:117] "RemoveContainer" 
containerID="c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb" Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.987504 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-combined-ca-bundle\") pod \"8e93819a-4d0e-4261-a89c-333f1558c5e6\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.987707 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgp2d\" (UniqueName: \"kubernetes.io/projected/8e93819a-4d0e-4261-a89c-333f1558c5e6-kube-api-access-jgp2d\") pod \"8e93819a-4d0e-4261-a89c-333f1558c5e6\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.987959 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e93819a-4d0e-4261-a89c-333f1558c5e6-logs\") pod \"8e93819a-4d0e-4261-a89c-333f1558c5e6\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.988032 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-config-data\") pod \"8e93819a-4d0e-4261-a89c-333f1558c5e6\" (UID: \"8e93819a-4d0e-4261-a89c-333f1558c5e6\") " Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.988485 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e93819a-4d0e-4261-a89c-333f1558c5e6-logs" (OuterVolumeSpecName: "logs") pod "8e93819a-4d0e-4261-a89c-333f1558c5e6" (UID: "8e93819a-4d0e-4261-a89c-333f1558c5e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.989598 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e93819a-4d0e-4261-a89c-333f1558c5e6-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.990499 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70c8a4f7-6983-4b1a-a959-25f4d04a6b30","Type":"ContainerStarted","Data":"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0"} Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.990560 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70c8a4f7-6983-4b1a-a959-25f4d04a6b30","Type":"ContainerStarted","Data":"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508"} Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.990570 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70c8a4f7-6983-4b1a-a959-25f4d04a6b30","Type":"ContainerStarted","Data":"b8e2c9577bcf3d364c1ddcd4e95c957dd9ee6aa23711e3c4b65bc5fbece9d7b8"} Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.990757 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerName="nova-metadata-log" containerID="cri-o://35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508" gracePeriod=30 Mar 18 15:59:57 crc kubenswrapper[4696]: I0318 15:59:57.991041 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerName="nova-metadata-metadata" containerID="cri-o://a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0" gracePeriod=30 Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.002120 4696 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e93819a-4d0e-4261-a89c-333f1558c5e6-kube-api-access-jgp2d" (OuterVolumeSpecName: "kube-api-access-jgp2d") pod "8e93819a-4d0e-4261-a89c-333f1558c5e6" (UID: "8e93819a-4d0e-4261-a89c-333f1558c5e6"). InnerVolumeSpecName "kube-api-access-jgp2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.002704 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" containerName="nova-scheduler-scheduler" containerID="cri-o://b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3" gracePeriod=30 Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.003535 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerStarted","Data":"3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1"} Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.005045 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.020333 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e93819a-4d0e-4261-a89c-333f1558c5e6" (UID: "8e93819a-4d0e-4261-a89c-333f1558c5e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.059614 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.059585963 podStartE2EDuration="3.059585963s" podCreationTimestamp="2026-03-18 15:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:58.032358817 +0000 UTC m=+1441.038533033" watchObservedRunningTime="2026-03-18 15:59:58.059585963 +0000 UTC m=+1441.065760169" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.084430 4696 scope.go:117] "RemoveContainer" containerID="f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.091451 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.091480 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgp2d\" (UniqueName: \"kubernetes.io/projected/8e93819a-4d0e-4261-a89c-333f1558c5e6-kube-api-access-jgp2d\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.100010 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-config-data" (OuterVolumeSpecName: "config-data") pod "8e93819a-4d0e-4261-a89c-333f1558c5e6" (UID: "8e93819a-4d0e-4261-a89c-333f1558c5e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.107190 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.298176123 podStartE2EDuration="12.107157751s" podCreationTimestamp="2026-03-18 15:59:46 +0000 UTC" firstStartedPulling="2026-03-18 15:59:48.539694464 +0000 UTC m=+1431.545868670" lastFinishedPulling="2026-03-18 15:59:57.348676092 +0000 UTC m=+1440.354850298" observedRunningTime="2026-03-18 15:59:58.097567189 +0000 UTC m=+1441.103741415" watchObservedRunningTime="2026-03-18 15:59:58.107157751 +0000 UTC m=+1441.113331967" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.136882 4696 scope.go:117] "RemoveContainer" containerID="c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb" Mar 18 15:59:58 crc kubenswrapper[4696]: E0318 15:59:58.137373 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb\": container with ID starting with c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb not found: ID does not exist" containerID="c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.137415 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb"} err="failed to get container status \"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb\": rpc error: code = NotFound desc = could not find container \"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb\": container with ID starting with c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb not found: ID does not exist" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.137451 4696 scope.go:117] 
"RemoveContainer" containerID="f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd" Mar 18 15:59:58 crc kubenswrapper[4696]: E0318 15:59:58.138040 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd\": container with ID starting with f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd not found: ID does not exist" containerID="f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.138077 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd"} err="failed to get container status \"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd\": rpc error: code = NotFound desc = could not find container \"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd\": container with ID starting with f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd not found: ID does not exist" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.138099 4696 scope.go:117] "RemoveContainer" containerID="c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.138511 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb"} err="failed to get container status \"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb\": rpc error: code = NotFound desc = could not find container \"c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb\": container with ID starting with c488c3278923950f4ec07fa85edf67ffda6c5a3ca671e2a700cc259b25f5adeb not found: ID does not exist" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.138553 4696 
scope.go:117] "RemoveContainer" containerID="f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.139664 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd"} err="failed to get container status \"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd\": rpc error: code = NotFound desc = could not find container \"f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd\": container with ID starting with f0d97ef1c5cb8853968c7f9bdc9de09fb4b507229641b9c977c7fc2615866fcd not found: ID does not exist" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.194146 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e93819a-4d0e-4261-a89c-333f1558c5e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.320826 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.334343 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.350198 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:58 crc kubenswrapper[4696]: E0318 15:59:58.350876 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerName="nova-api-log" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.350905 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerName="nova-api-log" Mar 18 15:59:58 crc kubenswrapper[4696]: E0318 15:59:58.350922 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerName="nova-api-api" Mar 18 
15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.350931 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerName="nova-api-api" Mar 18 15:59:58 crc kubenswrapper[4696]: E0318 15:59:58.350952 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e3998f-8e86-422d-bfdc-093018e4e311" containerName="nova-manage" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.350960 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e3998f-8e86-422d-bfdc-093018e4e311" containerName="nova-manage" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.351269 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e3998f-8e86-422d-bfdc-093018e4e311" containerName="nova-manage" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.351293 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerName="nova-api-api" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.351310 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" containerName="nova-api-log" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.352823 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.356507 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.402136 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.501604 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.501694 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-logs\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.503776 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnlnx\" (UniqueName: \"kubernetes.io/projected/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-kube-api-access-vnlnx\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.503868 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-config-data\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.607537 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.607929 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-logs\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.608028 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnlnx\" (UniqueName: \"kubernetes.io/projected/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-kube-api-access-vnlnx\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.608090 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-config-data\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.609930 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-logs\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.615539 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.615578 
4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-config-data\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.635181 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnlnx\" (UniqueName: \"kubernetes.io/projected/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-kube-api-access-vnlnx\") pod \"nova-api-0\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.680494 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 15:59:58 crc kubenswrapper[4696]: I0318 15:59:58.902777 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.017664 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-nova-metadata-tls-certs\") pod \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.017836 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-config-data\") pod \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.017919 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-combined-ca-bundle\") pod \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\" 
(UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.017952 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn726\" (UniqueName: \"kubernetes.io/projected/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-kube-api-access-vn726\") pod \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.017975 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-logs\") pod \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\" (UID: \"70c8a4f7-6983-4b1a-a959-25f4d04a6b30\") " Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.020513 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-logs" (OuterVolumeSpecName: "logs") pod "70c8a4f7-6983-4b1a-a959-25f4d04a6b30" (UID: "70c8a4f7-6983-4b1a-a959-25f4d04a6b30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.028697 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-kube-api-access-vn726" (OuterVolumeSpecName: "kube-api-access-vn726") pod "70c8a4f7-6983-4b1a-a959-25f4d04a6b30" (UID: "70c8a4f7-6983-4b1a-a959-25f4d04a6b30"). InnerVolumeSpecName "kube-api-access-vn726". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.031223 4696 generic.go:334] "Generic (PLEG): container finished" podID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerID="a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0" exitCode=0 Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.031251 4696 generic.go:334] "Generic (PLEG): container finished" podID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerID="35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508" exitCode=143 Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.031258 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.031330 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70c8a4f7-6983-4b1a-a959-25f4d04a6b30","Type":"ContainerDied","Data":"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0"} Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.031364 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70c8a4f7-6983-4b1a-a959-25f4d04a6b30","Type":"ContainerDied","Data":"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508"} Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.031396 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70c8a4f7-6983-4b1a-a959-25f4d04a6b30","Type":"ContainerDied","Data":"b8e2c9577bcf3d364c1ddcd4e95c957dd9ee6aa23711e3c4b65bc5fbece9d7b8"} Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.031416 4696 scope.go:117] "RemoveContainer" containerID="a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.057711 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-config-data" (OuterVolumeSpecName: "config-data") pod "70c8a4f7-6983-4b1a-a959-25f4d04a6b30" (UID: "70c8a4f7-6983-4b1a-a959-25f4d04a6b30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.065409 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70c8a4f7-6983-4b1a-a959-25f4d04a6b30" (UID: "70c8a4f7-6983-4b1a-a959-25f4d04a6b30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.091551 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "70c8a4f7-6983-4b1a-a959-25f4d04a6b30" (UID: "70c8a4f7-6983-4b1a-a959-25f4d04a6b30"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.095150 4696 scope.go:117] "RemoveContainer" containerID="35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.121500 4696 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.121563 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.121576 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.121589 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn726\" (UniqueName: \"kubernetes.io/projected/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-kube-api-access-vn726\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.121602 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70c8a4f7-6983-4b1a-a959-25f4d04a6b30-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.165199 4696 scope.go:117] "RemoveContainer" containerID="a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0" Mar 18 15:59:59 crc kubenswrapper[4696]: E0318 15:59:59.166542 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0\": container with ID starting with a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0 not found: ID does not exist" containerID="a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.166614 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0"} err="failed to get container status \"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0\": rpc error: code = NotFound desc = could not find container \"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0\": container with ID starting with a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0 not found: ID does not exist" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.166655 4696 scope.go:117] "RemoveContainer" containerID="35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508" Mar 18 15:59:59 crc kubenswrapper[4696]: E0318 15:59:59.171172 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508\": container with ID starting with 35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508 not found: ID does not exist" containerID="35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.171224 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508"} err="failed to get container status \"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508\": rpc error: code = NotFound desc = could not find container \"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508\": container with ID 
starting with 35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508 not found: ID does not exist" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.171258 4696 scope.go:117] "RemoveContainer" containerID="a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.172629 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0"} err="failed to get container status \"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0\": rpc error: code = NotFound desc = could not find container \"a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0\": container with ID starting with a13b8d82e3cd2616dfde2f592ea2b5dd3ccc09efc6afcb34b3f7b86cb82e35e0 not found: ID does not exist" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.172739 4696 scope.go:117] "RemoveContainer" containerID="35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.173810 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508"} err="failed to get container status \"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508\": rpc error: code = NotFound desc = could not find container \"35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508\": container with ID starting with 35f36db869e64037a82ca0f2a3b679d1f0599c24ccad8c79cf42160006183508 not found: ID does not exist" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.241512 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 15:59:59 crc kubenswrapper[4696]: E0318 15:59:59.307886 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c1a18ae_c261_421e_907b_e3bb372199a2.slice/crio-5b6295e623d9882ee5c1a781ea10e6bca407d26b413bc2def8ab58f02cc88f98.scope\": RecentStats: unable to find data in memory cache]" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.368187 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.398285 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.412631 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.552983 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:59 crc kubenswrapper[4696]: E0318 15:59:59.555038 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerName="nova-metadata-log" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.555065 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerName="nova-metadata-log" Mar 18 15:59:59 crc kubenswrapper[4696]: E0318 15:59:59.555112 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" containerName="nova-scheduler-scheduler" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.555124 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" containerName="nova-scheduler-scheduler" Mar 18 15:59:59 crc kubenswrapper[4696]: E0318 15:59:59.555158 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerName="nova-metadata-metadata" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.555166 4696 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerName="nova-metadata-metadata" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.555630 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerName="nova-metadata-log" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.555651 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" containerName="nova-metadata-metadata" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.555671 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" containerName="nova-scheduler-scheduler" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.558220 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.563198 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.563573 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.567003 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.581504 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-combined-ca-bundle\") pod \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.581759 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxv4h\" (UniqueName: 
\"kubernetes.io/projected/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-kube-api-access-mxv4h\") pod \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.581865 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-config-data\") pod \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\" (UID: \"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2\") " Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.591745 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-kube-api-access-mxv4h" (OuterVolumeSpecName: "kube-api-access-mxv4h") pod "5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" (UID: "5b4dcf15-5ff3-4cca-885b-1da4ea205dc2"). InnerVolumeSpecName "kube-api-access-mxv4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.625002 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-config-data" (OuterVolumeSpecName: "config-data") pod "5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" (UID: "5b4dcf15-5ff3-4cca-885b-1da4ea205dc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.644231 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c8a4f7-6983-4b1a-a959-25f4d04a6b30" path="/var/lib/kubelet/pods/70c8a4f7-6983-4b1a-a959-25f4d04a6b30/volumes" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.645025 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e93819a-4d0e-4261-a89c-333f1558c5e6" path="/var/lib/kubelet/pods/8e93819a-4d0e-4261-a89c-333f1558c5e6/volumes" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.673099 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" (UID: "5b4dcf15-5ff3-4cca-885b-1da4ea205dc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.684955 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.685030 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-config-data\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.685087 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.685176 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x77vs\" (UniqueName: \"kubernetes.io/projected/899e251c-2b43-4c26-a918-82e6444d9d21-kube-api-access-x77vs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.685204 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899e251c-2b43-4c26-a918-82e6444d9d21-logs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.685282 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.685297 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxv4h\" (UniqueName: \"kubernetes.io/projected/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-kube-api-access-mxv4h\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.685314 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.787399 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.787609 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x77vs\" (UniqueName: \"kubernetes.io/projected/899e251c-2b43-4c26-a918-82e6444d9d21-kube-api-access-x77vs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.787656 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899e251c-2b43-4c26-a918-82e6444d9d21-logs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.787907 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.787926 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-config-data\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.790941 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899e251c-2b43-4c26-a918-82e6444d9d21-logs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc 
kubenswrapper[4696]: I0318 15:59:59.792506 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-config-data\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.797378 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.798779 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 15:59:59 crc kubenswrapper[4696]: I0318 15:59:59.808651 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x77vs\" (UniqueName: \"kubernetes.io/projected/899e251c-2b43-4c26-a918-82e6444d9d21-kube-api-access-x77vs\") pod \"nova-metadata-0\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " pod="openstack/nova-metadata-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.073276 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.089373 4696 generic.go:334] "Generic (PLEG): container finished" podID="8c1a18ae-c261-421e-907b-e3bb372199a2" containerID="5b6295e623d9882ee5c1a781ea10e6bca407d26b413bc2def8ab58f02cc88f98" exitCode=0 Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.089830 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" event={"ID":"8c1a18ae-c261-421e-907b-e3bb372199a2","Type":"ContainerDied","Data":"5b6295e623d9882ee5c1a781ea10e6bca407d26b413bc2def8ab58f02cc88f98"} Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.100650 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1","Type":"ContainerStarted","Data":"c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7"} Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.100719 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1","Type":"ContainerStarted","Data":"3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5"} Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.100731 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1","Type":"ContainerStarted","Data":"a20933208e1ebbc1dc08353974c6b239fc5ce6a39c2f4af9843fec448a46668a"} Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.106920 4696 generic.go:334] "Generic (PLEG): container finished" podID="5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" containerID="b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3" exitCode=0 Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.107194 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.107776 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2","Type":"ContainerDied","Data":"b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3"} Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.107849 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5b4dcf15-5ff3-4cca-885b-1da4ea205dc2","Type":"ContainerDied","Data":"dbc916981183c7ed95eb4d8cfd455cfdb4ea17aae532e1d9a3a157d3f8a05ab5"} Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.107880 4696 scope.go:117] "RemoveContainer" containerID="b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.162244 4696 scope.go:117] "RemoveContainer" containerID="b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3" Mar 18 16:00:00 crc kubenswrapper[4696]: E0318 16:00:00.162786 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3\": container with ID starting with b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3 not found: ID does not exist" containerID="b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.162827 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3"} err="failed to get container status \"b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3\": rpc error: code = NotFound desc = could not find container \"b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3\": container with ID starting with 
b902172b79cce87298d9be2f6726aba63be2a6ac68752727792985889948c8c3 not found: ID does not exist" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.164658 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw"] Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.166390 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.169846 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.169993 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.177088 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564160-sfqqd"] Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.178657 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-sfqqd" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.181813 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.181877 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.182449 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.182432825 podStartE2EDuration="2.182432825s" podCreationTimestamp="2026-03-18 15:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:00.161688913 +0000 UTC m=+1443.167863119" watchObservedRunningTime="2026-03-18 16:00:00.182432825 +0000 UTC m=+1443.188607031" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.186419 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.226677 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-sfqqd"] Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.240934 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw"] Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.251618 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.286423 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.297161 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:00 
crc kubenswrapper[4696]: I0318 16:00:00.298717 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.299172 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s68jq\" (UniqueName: \"kubernetes.io/projected/74ca80c8-a4c1-4f87-9e78-5648ca013164-kube-api-access-s68jq\") pod \"auto-csr-approver-29564160-sfqqd\" (UID: \"74ca80c8-a4c1-4f87-9e78-5648ca013164\") " pod="openshift-infra/auto-csr-approver-29564160-sfqqd" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.299228 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/301c9939-0c8a-4b1c-82db-96fbef046ff7-secret-volume\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.299306 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/301c9939-0c8a-4b1c-82db-96fbef046ff7-config-volume\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.299427 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klrh9\" (UniqueName: \"kubernetes.io/projected/301c9939-0c8a-4b1c-82db-96fbef046ff7-kube-api-access-klrh9\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.302744 4696 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.331125 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.402941 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-config-data\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.403372 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdlmp\" (UniqueName: \"kubernetes.io/projected/30a10886-71b7-48b8-af79-859f4c134f28-kube-api-access-sdlmp\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.403411 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klrh9\" (UniqueName: \"kubernetes.io/projected/301c9939-0c8a-4b1c-82db-96fbef046ff7-kube-api-access-klrh9\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.403491 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68jq\" (UniqueName: \"kubernetes.io/projected/74ca80c8-a4c1-4f87-9e78-5648ca013164-kube-api-access-s68jq\") pod \"auto-csr-approver-29564160-sfqqd\" (UID: \"74ca80c8-a4c1-4f87-9e78-5648ca013164\") " pod="openshift-infra/auto-csr-approver-29564160-sfqqd" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.403509 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/301c9939-0c8a-4b1c-82db-96fbef046ff7-secret-volume\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.403567 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.403607 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/301c9939-0c8a-4b1c-82db-96fbef046ff7-config-volume\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.404755 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/301c9939-0c8a-4b1c-82db-96fbef046ff7-config-volume\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.418772 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/301c9939-0c8a-4b1c-82db-96fbef046ff7-secret-volume\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 
16:00:00.442368 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klrh9\" (UniqueName: \"kubernetes.io/projected/301c9939-0c8a-4b1c-82db-96fbef046ff7-kube-api-access-klrh9\") pod \"collect-profiles-29564160-dj9vw\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.471813 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s68jq\" (UniqueName: \"kubernetes.io/projected/74ca80c8-a4c1-4f87-9e78-5648ca013164-kube-api-access-s68jq\") pod \"auto-csr-approver-29564160-sfqqd\" (UID: \"74ca80c8-a4c1-4f87-9e78-5648ca013164\") " pod="openshift-infra/auto-csr-approver-29564160-sfqqd" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.494790 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.529394 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.530125 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-config-data\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.530494 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdlmp\" (UniqueName: 
\"kubernetes.io/projected/30a10886-71b7-48b8-af79-859f4c134f28-kube-api-access-sdlmp\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.531839 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-sfqqd" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.589394 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.597096 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-config-data\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.602085 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdlmp\" (UniqueName: \"kubernetes.io/projected/30a10886-71b7-48b8-af79-859f4c134f28-kube-api-access-sdlmp\") pod \"nova-scheduler-0\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.618501 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:00 crc kubenswrapper[4696]: I0318 16:00:00.909327 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:01 crc kubenswrapper[4696]: W0318 16:00:01.154794 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301c9939_0c8a_4b1c_82db_96fbef046ff7.slice/crio-e56ab8529c30be5bcd768e6201c11abfef65db98cba63662b71dbc97eebbf7e1 WatchSource:0}: Error finding container e56ab8529c30be5bcd768e6201c11abfef65db98cba63662b71dbc97eebbf7e1: Status 404 returned error can't find the container with id e56ab8529c30be5bcd768e6201c11abfef65db98cba63662b71dbc97eebbf7e1 Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.155880 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899e251c-2b43-4c26-a918-82e6444d9d21","Type":"ContainerStarted","Data":"583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93"} Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.156204 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899e251c-2b43-4c26-a918-82e6444d9d21","Type":"ContainerStarted","Data":"68d2b6a6d953017771001e7a9399f9236f31eabbe3976d23c8ed56f869f81346"} Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.157724 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-sfqqd"] Mar 18 16:00:01 crc kubenswrapper[4696]: W0318 16:00:01.161328 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74ca80c8_a4c1_4f87_9e78_5648ca013164.slice/crio-ada21783a02e0baa5e1d612e57b9e65919427a51454df740f1fe80b63d998053 WatchSource:0}: Error finding container ada21783a02e0baa5e1d612e57b9e65919427a51454df740f1fe80b63d998053: Status 404 returned error can't find the container with 
id ada21783a02e0baa5e1d612e57b9e65919427a51454df740f1fe80b63d998053 Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.175109 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw"] Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.366228 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.618211 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b4dcf15-5ff3-4cca-885b-1da4ea205dc2" path="/var/lib/kubelet/pods/5b4dcf15-5ff3-4cca-885b-1da4ea205dc2/volumes" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.625762 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.711490 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxv5g\" (UniqueName: \"kubernetes.io/projected/8c1a18ae-c261-421e-907b-e3bb372199a2-kube-api-access-gxv5g\") pod \"8c1a18ae-c261-421e-907b-e3bb372199a2\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.711686 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-config-data\") pod \"8c1a18ae-c261-421e-907b-e3bb372199a2\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.711739 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-combined-ca-bundle\") pod \"8c1a18ae-c261-421e-907b-e3bb372199a2\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.711911 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-scripts\") pod \"8c1a18ae-c261-421e-907b-e3bb372199a2\" (UID: \"8c1a18ae-c261-421e-907b-e3bb372199a2\") " Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.724807 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-scripts" (OuterVolumeSpecName: "scripts") pod "8c1a18ae-c261-421e-907b-e3bb372199a2" (UID: "8c1a18ae-c261-421e-907b-e3bb372199a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.725512 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1a18ae-c261-421e-907b-e3bb372199a2-kube-api-access-gxv5g" (OuterVolumeSpecName: "kube-api-access-gxv5g") pod "8c1a18ae-c261-421e-907b-e3bb372199a2" (UID: "8c1a18ae-c261-421e-907b-e3bb372199a2"). InnerVolumeSpecName "kube-api-access-gxv5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.766984 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-config-data" (OuterVolumeSpecName: "config-data") pod "8c1a18ae-c261-421e-907b-e3bb372199a2" (UID: "8c1a18ae-c261-421e-907b-e3bb372199a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.782342 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c1a18ae-c261-421e-907b-e3bb372199a2" (UID: "8c1a18ae-c261-421e-907b-e3bb372199a2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.816328 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.816372 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxv5g\" (UniqueName: \"kubernetes.io/projected/8c1a18ae-c261-421e-907b-e3bb372199a2-kube-api-access-gxv5g\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.816385 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:01 crc kubenswrapper[4696]: I0318 16:00:01.816398 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1a18ae-c261-421e-907b-e3bb372199a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.175854 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30a10886-71b7-48b8-af79-859f4c134f28","Type":"ContainerStarted","Data":"81814b15831039d10d39091be6c53a085189aae74daf95ac1cd21d2a16886774"} Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.184839 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-sfqqd" event={"ID":"74ca80c8-a4c1-4f87-9e78-5648ca013164","Type":"ContainerStarted","Data":"ada21783a02e0baa5e1d612e57b9e65919427a51454df740f1fe80b63d998053"} Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.187321 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.187339 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2cpbx" event={"ID":"8c1a18ae-c261-421e-907b-e3bb372199a2","Type":"ContainerDied","Data":"07a759efca230d60b28d43e4356e6c2bde4ee5c4ec19228fcf757e698af1980a"} Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.187368 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07a759efca230d60b28d43e4356e6c2bde4ee5c4ec19228fcf757e698af1980a" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.192507 4696 generic.go:334] "Generic (PLEG): container finished" podID="301c9939-0c8a-4b1c-82db-96fbef046ff7" containerID="ea00a712d0a04b47e3fd09162a7983c453750d0a13341d29532e32fcfbeb7d80" exitCode=0 Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.192595 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" event={"ID":"301c9939-0c8a-4b1c-82db-96fbef046ff7","Type":"ContainerDied","Data":"ea00a712d0a04b47e3fd09162a7983c453750d0a13341d29532e32fcfbeb7d80"} Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.192640 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" event={"ID":"301c9939-0c8a-4b1c-82db-96fbef046ff7","Type":"ContainerStarted","Data":"e56ab8529c30be5bcd768e6201c11abfef65db98cba63662b71dbc97eebbf7e1"} Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.228466 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:00:02 crc kubenswrapper[4696]: E0318 16:00:02.229439 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1a18ae-c261-421e-907b-e3bb372199a2" containerName="nova-cell1-conductor-db-sync" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.229562 4696 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1a18ae-c261-421e-907b-e3bb372199a2" containerName="nova-cell1-conductor-db-sync" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.229956 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1a18ae-c261-421e-907b-e3bb372199a2" containerName="nova-cell1-conductor-db-sync" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.231243 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.236997 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.241577 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.328826 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cc8eab-a88e-49ce-830b-9e352aea0d5f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.328926 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdc8q\" (UniqueName: \"kubernetes.io/projected/69cc8eab-a88e-49ce-830b-9e352aea0d5f-kube-api-access-wdc8q\") pod \"nova-cell1-conductor-0\" (UID: \"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.329062 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cc8eab-a88e-49ce-830b-9e352aea0d5f-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.432012 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cc8eab-a88e-49ce-830b-9e352aea0d5f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.432150 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdc8q\" (UniqueName: \"kubernetes.io/projected/69cc8eab-a88e-49ce-830b-9e352aea0d5f-kube-api-access-wdc8q\") pod \"nova-cell1-conductor-0\" (UID: \"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.432346 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cc8eab-a88e-49ce-830b-9e352aea0d5f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.439346 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cc8eab-a88e-49ce-830b-9e352aea0d5f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.441486 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cc8eab-a88e-49ce-830b-9e352aea0d5f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: 
I0318 16:00:02.455535 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdc8q\" (UniqueName: \"kubernetes.io/projected/69cc8eab-a88e-49ce-830b-9e352aea0d5f-kube-api-access-wdc8q\") pod \"nova-cell1-conductor-0\" (UID: \"69cc8eab-a88e-49ce-830b-9e352aea0d5f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:02 crc kubenswrapper[4696]: I0318 16:00:02.564654 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:03 crc kubenswrapper[4696]: W0318 16:00:03.082658 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69cc8eab_a88e_49ce_830b_9e352aea0d5f.slice/crio-a3df8da8347d04fcd69a9912d4752b7595dea85c636c64dcf82375b5b6424767 WatchSource:0}: Error finding container a3df8da8347d04fcd69a9912d4752b7595dea85c636c64dcf82375b5b6424767: Status 404 returned error can't find the container with id a3df8da8347d04fcd69a9912d4752b7595dea85c636c64dcf82375b5b6424767 Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.093551 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.145338 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.220406 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69cc8eab-a88e-49ce-830b-9e352aea0d5f","Type":"ContainerStarted","Data":"a3df8da8347d04fcd69a9912d4752b7595dea85c636c64dcf82375b5b6424767"} Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.234872 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"899e251c-2b43-4c26-a918-82e6444d9d21","Type":"ContainerStarted","Data":"9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2"} Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.238046 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30a10886-71b7-48b8-af79-859f4c134f28","Type":"ContainerStarted","Data":"4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0"} Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.687955 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.762898 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/301c9939-0c8a-4b1c-82db-96fbef046ff7-secret-volume\") pod \"301c9939-0c8a-4b1c-82db-96fbef046ff7\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.762989 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klrh9\" (UniqueName: \"kubernetes.io/projected/301c9939-0c8a-4b1c-82db-96fbef046ff7-kube-api-access-klrh9\") pod \"301c9939-0c8a-4b1c-82db-96fbef046ff7\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.763053 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/301c9939-0c8a-4b1c-82db-96fbef046ff7-config-volume\") pod \"301c9939-0c8a-4b1c-82db-96fbef046ff7\" (UID: \"301c9939-0c8a-4b1c-82db-96fbef046ff7\") " Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.767260 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/301c9939-0c8a-4b1c-82db-96fbef046ff7-config-volume" (OuterVolumeSpecName: "config-volume") 
pod "301c9939-0c8a-4b1c-82db-96fbef046ff7" (UID: "301c9939-0c8a-4b1c-82db-96fbef046ff7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.770423 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301c9939-0c8a-4b1c-82db-96fbef046ff7-kube-api-access-klrh9" (OuterVolumeSpecName: "kube-api-access-klrh9") pod "301c9939-0c8a-4b1c-82db-96fbef046ff7" (UID: "301c9939-0c8a-4b1c-82db-96fbef046ff7"). InnerVolumeSpecName "kube-api-access-klrh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.771035 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301c9939-0c8a-4b1c-82db-96fbef046ff7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "301c9939-0c8a-4b1c-82db-96fbef046ff7" (UID: "301c9939-0c8a-4b1c-82db-96fbef046ff7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.866487 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/301c9939-0c8a-4b1c-82db-96fbef046ff7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.866839 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klrh9\" (UniqueName: \"kubernetes.io/projected/301c9939-0c8a-4b1c-82db-96fbef046ff7-kube-api-access-klrh9\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4696]: I0318 16:00:03.866909 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/301c9939-0c8a-4b1c-82db-96fbef046ff7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:04 crc kubenswrapper[4696]: I0318 16:00:04.250341 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" event={"ID":"301c9939-0c8a-4b1c-82db-96fbef046ff7","Type":"ContainerDied","Data":"e56ab8529c30be5bcd768e6201c11abfef65db98cba63662b71dbc97eebbf7e1"} Mar 18 16:00:04 crc kubenswrapper[4696]: I0318 16:00:04.251185 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e56ab8529c30be5bcd768e6201c11abfef65db98cba63662b71dbc97eebbf7e1" Mar 18 16:00:04 crc kubenswrapper[4696]: I0318 16:00:04.250698 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw" Mar 18 16:00:04 crc kubenswrapper[4696]: I0318 16:00:04.254466 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69cc8eab-a88e-49ce-830b-9e352aea0d5f","Type":"ContainerStarted","Data":"dfd375286401ba64d57077e517c1065b47a1bc6de8ebe9066986e82fb796925c"} Mar 18 16:00:04 crc kubenswrapper[4696]: I0318 16:00:04.254619 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:04 crc kubenswrapper[4696]: I0318 16:00:04.292963 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=5.292928177 podStartE2EDuration="5.292928177s" podCreationTimestamp="2026-03-18 15:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:04.272077992 +0000 UTC m=+1447.278252208" watchObservedRunningTime="2026-03-18 16:00:04.292928177 +0000 UTC m=+1447.299102423" Mar 18 16:00:04 crc kubenswrapper[4696]: I0318 16:00:04.320599 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.320489231 podStartE2EDuration="2.320489231s" podCreationTimestamp="2026-03-18 16:00:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:04.301052772 +0000 UTC m=+1447.307226998" watchObservedRunningTime="2026-03-18 16:00:04.320489231 +0000 UTC m=+1447.326663457" Mar 18 16:00:04 crc kubenswrapper[4696]: I0318 16:00:04.357701 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.357678227 podStartE2EDuration="4.357678227s" podCreationTimestamp="2026-03-18 16:00:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:04.338566606 +0000 UTC m=+1447.344740812" watchObservedRunningTime="2026-03-18 16:00:04.357678227 +0000 UTC m=+1447.363852433" Mar 18 16:00:05 crc kubenswrapper[4696]: I0318 16:00:05.620218 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 16:00:08 crc kubenswrapper[4696]: I0318 16:00:08.681587 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:00:08 crc kubenswrapper[4696]: I0318 16:00:08.681989 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:00:09 crc kubenswrapper[4696]: I0318 16:00:09.764860 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:00:09 crc kubenswrapper[4696]: I0318 16:00:09.764870 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:00:10 crc kubenswrapper[4696]: I0318 16:00:10.074296 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:00:10 crc kubenswrapper[4696]: I0318 16:00:10.075203 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:00:10 crc kubenswrapper[4696]: I0318 16:00:10.619903 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-scheduler-0" Mar 18 16:00:10 crc kubenswrapper[4696]: I0318 16:00:10.654153 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 16:00:11 crc kubenswrapper[4696]: I0318 16:00:11.092089 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:00:11 crc kubenswrapper[4696]: I0318 16:00:11.092104 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:00:11 crc kubenswrapper[4696]: I0318 16:00:11.360697 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 16:00:12 crc kubenswrapper[4696]: I0318 16:00:12.607504 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:16 crc kubenswrapper[4696]: I0318 16:00:16.681419 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:00:16 crc kubenswrapper[4696]: I0318 16:00:16.681731 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:00:17 crc kubenswrapper[4696]: I0318 16:00:17.021027 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 16:00:18 crc kubenswrapper[4696]: I0318 16:00:18.073779 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:00:18 crc kubenswrapper[4696]: I0318 
16:00:18.073850 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:00:18 crc kubenswrapper[4696]: I0318 16:00:18.685638 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:00:18 crc kubenswrapper[4696]: I0318 16:00:18.686596 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:00:18 crc kubenswrapper[4696]: I0318 16:00:18.688594 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.404348 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.645988 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jxj45"] Mar 18 16:00:19 crc kubenswrapper[4696]: E0318 16:00:19.646723 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301c9939-0c8a-4b1c-82db-96fbef046ff7" containerName="collect-profiles" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.646749 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="301c9939-0c8a-4b1c-82db-96fbef046ff7" containerName="collect-profiles" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.647048 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="301c9939-0c8a-4b1c-82db-96fbef046ff7" containerName="collect-profiles" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.649187 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.655681 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jxj45"] Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.707720 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.708102 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-config\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.708163 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.708234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq28k\" (UniqueName: \"kubernetes.io/projected/a024523d-d753-4a81-a8e5-2b416559d14f-kube-api-access-pq28k\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.708269 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.708785 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.810697 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.810908 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.810952 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-config\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.811005 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.811058 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq28k\" (UniqueName: \"kubernetes.io/projected/a024523d-d753-4a81-a8e5-2b416559d14f-kube-api-access-pq28k\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.811087 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.812391 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.813301 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.813467 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-config\") pod 
\"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.813955 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.817923 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:19 crc kubenswrapper[4696]: I0318 16:00:19.832771 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq28k\" (UniqueName: \"kubernetes.io/projected/a024523d-d753-4a81-a8e5-2b416559d14f-kube-api-access-pq28k\") pod \"dnsmasq-dns-cd5cbd7b9-jxj45\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:20 crc kubenswrapper[4696]: I0318 16:00:20.016507 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:20 crc kubenswrapper[4696]: I0318 16:00:20.082613 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:00:20 crc kubenswrapper[4696]: I0318 16:00:20.087386 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:00:20 crc kubenswrapper[4696]: I0318 16:00:20.111377 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:00:20 crc kubenswrapper[4696]: I0318 16:00:20.414860 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.061986 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.062688 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="ceilometer-central-agent" containerID="cri-o://953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53" gracePeriod=30 Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.062786 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="proxy-httpd" containerID="cri-o://3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1" gracePeriod=30 Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.062841 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="ceilometer-notification-agent" containerID="cri-o://3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28" gracePeriod=30 Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 
16:00:21.062793 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="sg-core" containerID="cri-o://a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2" gracePeriod=30 Mar 18 16:00:21 crc kubenswrapper[4696]: W0318 16:00:21.250537 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda024523d_d753_4a81_a8e5_2b416559d14f.slice/crio-bf76b2109dddc178c34afe1cd4c7304a087d90d672ced243f59bcabfccf1070f WatchSource:0}: Error finding container bf76b2109dddc178c34afe1cd4c7304a087d90d672ced243f59bcabfccf1070f: Status 404 returned error can't find the container with id bf76b2109dddc178c34afe1cd4c7304a087d90d672ced243f59bcabfccf1070f Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.253742 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jxj45"] Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.423385 4696 generic.go:334] "Generic (PLEG): container finished" podID="bdc7971b-ce0d-490b-b046-7be30738505a" containerID="3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1" exitCode=0 Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.423623 4696 generic.go:334] "Generic (PLEG): container finished" podID="bdc7971b-ce0d-490b-b046-7be30738505a" containerID="a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2" exitCode=2 Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.423560 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerDied","Data":"3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1"} Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.423815 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerDied","Data":"a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2"} Mar 18 16:00:21 crc kubenswrapper[4696]: I0318 16:00:21.425688 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" event={"ID":"a024523d-d753-4a81-a8e5-2b416559d14f","Type":"ContainerStarted","Data":"bf76b2109dddc178c34afe1cd4c7304a087d90d672ced243f59bcabfccf1070f"} Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.129004 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.437610 4696 generic.go:334] "Generic (PLEG): container finished" podID="a024523d-d753-4a81-a8e5-2b416559d14f" containerID="f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22" exitCode=0 Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.437683 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" event={"ID":"a024523d-d753-4a81-a8e5-2b416559d14f","Type":"ContainerDied","Data":"f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22"} Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.441609 4696 generic.go:334] "Generic (PLEG): container finished" podID="74ca80c8-a4c1-4f87-9e78-5648ca013164" containerID="857d54e8e1a459e51f8e42f236e58dd773b4b9b1cb4bc6ce7c39f8b55bc22fd4" exitCode=0 Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.441662 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-sfqqd" event={"ID":"74ca80c8-a4c1-4f87-9e78-5648ca013164","Type":"ContainerDied","Data":"857d54e8e1a459e51f8e42f236e58dd773b4b9b1cb4bc6ce7c39f8b55bc22fd4"} Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.459291 4696 generic.go:334] "Generic (PLEG): container finished" podID="bdc7971b-ce0d-490b-b046-7be30738505a" containerID="953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53" 
exitCode=0 Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.459546 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerDied","Data":"953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53"} Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.459625 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-log" containerID="cri-o://3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5" gracePeriod=30 Mar 18 16:00:22 crc kubenswrapper[4696]: I0318 16:00:22.459802 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-api" containerID="cri-o://c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7" gracePeriod=30 Mar 18 16:00:23 crc kubenswrapper[4696]: I0318 16:00:23.486711 4696 generic.go:334] "Generic (PLEG): container finished" podID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerID="3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5" exitCode=143 Mar 18 16:00:23 crc kubenswrapper[4696]: I0318 16:00:23.487533 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1","Type":"ContainerDied","Data":"3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5"} Mar 18 16:00:23 crc kubenswrapper[4696]: I0318 16:00:23.495196 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" event={"ID":"a024523d-d753-4a81-a8e5-2b416559d14f","Type":"ContainerStarted","Data":"901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6"} Mar 18 16:00:23 crc kubenswrapper[4696]: I0318 16:00:23.496696 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:23 crc kubenswrapper[4696]: I0318 16:00:23.525592 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" podStartSLOduration=4.525565133 podStartE2EDuration="4.525565133s" podCreationTimestamp="2026-03-18 16:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:23.518912836 +0000 UTC m=+1466.525087042" watchObservedRunningTime="2026-03-18 16:00:23.525565133 +0000 UTC m=+1466.531739339" Mar 18 16:00:23 crc kubenswrapper[4696]: I0318 16:00:23.910329 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-sfqqd" Mar 18 16:00:24 crc kubenswrapper[4696]: I0318 16:00:24.039033 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s68jq\" (UniqueName: \"kubernetes.io/projected/74ca80c8-a4c1-4f87-9e78-5648ca013164-kube-api-access-s68jq\") pod \"74ca80c8-a4c1-4f87-9e78-5648ca013164\" (UID: \"74ca80c8-a4c1-4f87-9e78-5648ca013164\") " Mar 18 16:00:24 crc kubenswrapper[4696]: I0318 16:00:24.047588 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ca80c8-a4c1-4f87-9e78-5648ca013164-kube-api-access-s68jq" (OuterVolumeSpecName: "kube-api-access-s68jq") pod "74ca80c8-a4c1-4f87-9e78-5648ca013164" (UID: "74ca80c8-a4c1-4f87-9e78-5648ca013164"). InnerVolumeSpecName "kube-api-access-s68jq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:24 crc kubenswrapper[4696]: I0318 16:00:24.142345 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s68jq\" (UniqueName: \"kubernetes.io/projected/74ca80c8-a4c1-4f87-9e78-5648ca013164-kube-api-access-s68jq\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:24 crc kubenswrapper[4696]: I0318 16:00:24.510475 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-sfqqd" Mar 18 16:00:24 crc kubenswrapper[4696]: I0318 16:00:24.512645 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-sfqqd" event={"ID":"74ca80c8-a4c1-4f87-9e78-5648ca013164","Type":"ContainerDied","Data":"ada21783a02e0baa5e1d612e57b9e65919427a51454df740f1fe80b63d998053"} Mar 18 16:00:24 crc kubenswrapper[4696]: I0318 16:00:24.512690 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ada21783a02e0baa5e1d612e57b9e65919427a51454df740f1fe80b63d998053" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.034559 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-pd7vl"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.046153 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-pd7vl"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.161825 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.271451 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-scripts\") pod \"bdc7971b-ce0d-490b-b046-7be30738505a\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.271645 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-combined-ca-bundle\") pod \"bdc7971b-ce0d-490b-b046-7be30738505a\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.271679 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kq6p\" (UniqueName: \"kubernetes.io/projected/bdc7971b-ce0d-490b-b046-7be30738505a-kube-api-access-9kq6p\") pod \"bdc7971b-ce0d-490b-b046-7be30738505a\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.271873 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-run-httpd\") pod \"bdc7971b-ce0d-490b-b046-7be30738505a\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.272002 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-log-httpd\") pod \"bdc7971b-ce0d-490b-b046-7be30738505a\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.272043 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-sg-core-conf-yaml\") pod \"bdc7971b-ce0d-490b-b046-7be30738505a\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.272073 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-ceilometer-tls-certs\") pod \"bdc7971b-ce0d-490b-b046-7be30738505a\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.272108 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-config-data\") pod \"bdc7971b-ce0d-490b-b046-7be30738505a\" (UID: \"bdc7971b-ce0d-490b-b046-7be30738505a\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.273329 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bdc7971b-ce0d-490b-b046-7be30738505a" (UID: "bdc7971b-ce0d-490b-b046-7be30738505a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.275197 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bdc7971b-ce0d-490b-b046-7be30738505a" (UID: "bdc7971b-ce0d-490b-b046-7be30738505a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.283770 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc7971b-ce0d-490b-b046-7be30738505a-kube-api-access-9kq6p" (OuterVolumeSpecName: "kube-api-access-9kq6p") pod "bdc7971b-ce0d-490b-b046-7be30738505a" (UID: "bdc7971b-ce0d-490b-b046-7be30738505a"). InnerVolumeSpecName "kube-api-access-9kq6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.285269 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-scripts" (OuterVolumeSpecName: "scripts") pod "bdc7971b-ce0d-490b-b046-7be30738505a" (UID: "bdc7971b-ce0d-490b-b046-7be30738505a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.309984 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bdc7971b-ce0d-490b-b046-7be30738505a" (UID: "bdc7971b-ce0d-490b-b046-7be30738505a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.348843 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bdc7971b-ce0d-490b-b046-7be30738505a" (UID: "bdc7971b-ce0d-490b-b046-7be30738505a"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.366259 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdc7971b-ce0d-490b-b046-7be30738505a" (UID: "bdc7971b-ce0d-490b-b046-7be30738505a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.374655 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.374693 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdc7971b-ce0d-490b-b046-7be30738505a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.374709 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.374722 4696 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.374735 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.374746 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.374758 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kq6p\" (UniqueName: \"kubernetes.io/projected/bdc7971b-ce0d-490b-b046-7be30738505a-kube-api-access-9kq6p\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.376294 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.430885 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-config-data" (OuterVolumeSpecName: "config-data") pod "bdc7971b-ce0d-490b-b046-7be30738505a" (UID: "bdc7971b-ce0d-490b-b046-7be30738505a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.476754 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-769jc\" (UniqueName: \"kubernetes.io/projected/008931c6-5d36-4eea-954c-0f583bde3955-kube-api-access-769jc\") pod \"008931c6-5d36-4eea-954c-0f583bde3955\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.476869 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-config-data\") pod \"008931c6-5d36-4eea-954c-0f583bde3955\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.477125 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-combined-ca-bundle\") pod 
\"008931c6-5d36-4eea-954c-0f583bde3955\" (UID: \"008931c6-5d36-4eea-954c-0f583bde3955\") " Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.477753 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc7971b-ce0d-490b-b046-7be30738505a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.480672 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008931c6-5d36-4eea-954c-0f583bde3955-kube-api-access-769jc" (OuterVolumeSpecName: "kube-api-access-769jc") pod "008931c6-5d36-4eea-954c-0f583bde3955" (UID: "008931c6-5d36-4eea-954c-0f583bde3955"). InnerVolumeSpecName "kube-api-access-769jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.503715 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-config-data" (OuterVolumeSpecName: "config-data") pod "008931c6-5d36-4eea-954c-0f583bde3955" (UID: "008931c6-5d36-4eea-954c-0f583bde3955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.520013 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "008931c6-5d36-4eea-954c-0f583bde3955" (UID: "008931c6-5d36-4eea-954c-0f583bde3955"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.521673 4696 generic.go:334] "Generic (PLEG): container finished" podID="008931c6-5d36-4eea-954c-0f583bde3955" containerID="9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7" exitCode=137 Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.521837 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.522592 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"008931c6-5d36-4eea-954c-0f583bde3955","Type":"ContainerDied","Data":"9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7"} Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.522652 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"008931c6-5d36-4eea-954c-0f583bde3955","Type":"ContainerDied","Data":"f4495eb25b493f41d161d31e08dab5613dc262476a48f8a68e2eda0d239f9c46"} Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.522676 4696 scope.go:117] "RemoveContainer" containerID="9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.530931 4696 generic.go:334] "Generic (PLEG): container finished" podID="bdc7971b-ce0d-490b-b046-7be30738505a" containerID="3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28" exitCode=0 Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.531008 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.531029 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerDied","Data":"3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28"} Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.531071 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdc7971b-ce0d-490b-b046-7be30738505a","Type":"ContainerDied","Data":"a070ad6c78e95935a9bb4b277522569ca30cfb54f99509b53bae1c4e00a533fa"} Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.555488 4696 scope.go:117] "RemoveContainer" containerID="9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.556889 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7\": container with ID starting with 9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7 not found: ID does not exist" containerID="9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.556954 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7"} err="failed to get container status \"9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7\": rpc error: code = NotFound desc = could not find container \"9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7\": container with ID starting with 9d712819b001e31abb5b8f3b8ddf668faf125f5c623ae6e243b35577ffe746c7 not found: ID does not exist" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.556991 4696 scope.go:117] "RemoveContainer" 
containerID="3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.572497 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.579784 4696 scope.go:117] "RemoveContainer" containerID="a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.579886 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-769jc\" (UniqueName: \"kubernetes.io/projected/008931c6-5d36-4eea-954c-0f583bde3955-kube-api-access-769jc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.579932 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.579948 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008931c6-5d36-4eea-954c-0f583bde3955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.593537 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.611031 4696 scope.go:117] "RemoveContainer" containerID="3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.618104 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" path="/var/lib/kubelet/pods/bdc7971b-ce0d-490b-b046-7be30738505a/volumes" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.619020 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6bdd44-a607-4081-a732-f572001c79af" 
path="/var/lib/kubelet/pods/dd6bdd44-a607-4081-a732-f572001c79af/volumes" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.621157 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.624255 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.635939 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.636584 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008931c6-5d36-4eea-954c-0f583bde3955" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.636609 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="008931c6-5d36-4eea-954c-0f583bde3955" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.636633 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ca80c8-a4c1-4f87-9e78-5648ca013164" containerName="oc" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.636643 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ca80c8-a4c1-4f87-9e78-5648ca013164" containerName="oc" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.636681 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="ceilometer-central-agent" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.636690 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="ceilometer-central-agent" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.636717 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="ceilometer-notification-agent" Mar 18 16:00:25 crc 
kubenswrapper[4696]: I0318 16:00:25.636724 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="ceilometer-notification-agent" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.636735 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="proxy-httpd" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.636743 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="proxy-httpd" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.636759 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="sg-core" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.636808 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="sg-core" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.637036 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="008931c6-5d36-4eea-954c-0f583bde3955" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.637050 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="ceilometer-notification-agent" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.637075 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ca80c8-a4c1-4f87-9e78-5648ca013164" containerName="oc" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.637087 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="ceilometer-central-agent" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.637100 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="proxy-httpd" Mar 
18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.637114 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc7971b-ce0d-490b-b046-7be30738505a" containerName="sg-core" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.653447 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.662749 4696 scope.go:117] "RemoveContainer" containerID="953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.662940 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.665855 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.666085 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.666269 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.685715 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.690965 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.693539 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.693893 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.694514 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.714262 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.729562 4696 scope.go:117] "RemoveContainer" containerID="3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.730940 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1\": container with ID starting with 3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1 not found: ID does not exist" containerID="3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.730978 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1"} err="failed to get container status \"3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1\": rpc error: code = NotFound desc = could not find container \"3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1\": container with ID starting with 3946083d2400a1e43d9cf4a178ed0f6bdf05b03c04b7eb34290bef7a358a53a1 not found: ID does not 
exist" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.731008 4696 scope.go:117] "RemoveContainer" containerID="a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.731280 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2\": container with ID starting with a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2 not found: ID does not exist" containerID="a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.731305 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2"} err="failed to get container status \"a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2\": rpc error: code = NotFound desc = could not find container \"a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2\": container with ID starting with a01df05cf9c30d3fe815a40c748f360fc726e738754b228dd838e8759b3e72c2 not found: ID does not exist" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.731322 4696 scope.go:117] "RemoveContainer" containerID="3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.731530 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28\": container with ID starting with 3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28 not found: ID does not exist" containerID="3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.731553 4696 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28"} err="failed to get container status \"3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28\": rpc error: code = NotFound desc = could not find container \"3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28\": container with ID starting with 3589742346c6aebfd1e9e14560ed9c396299d74a20ca99b7d428c654123a9b28 not found: ID does not exist" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.731571 4696 scope.go:117] "RemoveContainer" containerID="953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53" Mar 18 16:00:25 crc kubenswrapper[4696]: E0318 16:00:25.732533 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53\": container with ID starting with 953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53 not found: ID does not exist" containerID="953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.732560 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53"} err="failed to get container status \"953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53\": rpc error: code = NotFound desc = could not find container \"953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53\": container with ID starting with 953ae1a9222abd277f53185b46363be0b7b2e7b36397ee31fe0c6140f9e71e53 not found: ID does not exist" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784617 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txfnj\" (UniqueName: 
\"kubernetes.io/projected/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-kube-api-access-txfnj\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784692 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784719 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chgf5\" (UniqueName: \"kubernetes.io/projected/21cc776b-31bc-469a-9b50-930b0480541d-kube-api-access-chgf5\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784751 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-scripts\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784789 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-log-httpd\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784811 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-config-data\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784836 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784859 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784910 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-run-httpd\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784928 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.784974 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.785001 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.785035 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886546 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-scripts\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886605 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-log-httpd\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886629 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-config-data\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886660 
4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886688 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886749 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886769 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-run-httpd\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886802 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886823 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886851 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886920 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txfnj\" (UniqueName: \"kubernetes.io/projected/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-kube-api-access-txfnj\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886953 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.886971 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chgf5\" (UniqueName: \"kubernetes.io/projected/21cc776b-31bc-469a-9b50-930b0480541d-kube-api-access-chgf5\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.888717 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-run-httpd\") pod \"ceilometer-0\" (UID: 
\"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.888997 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-log-httpd\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.894103 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.895230 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-config-data\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.899059 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.899621 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.899641 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-scripts\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.899892 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.907371 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.910530 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/21cc776b-31bc-469a-9b50-930b0480541d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.914781 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txfnj\" (UniqueName: \"kubernetes.io/projected/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-kube-api-access-txfnj\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.918342 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chgf5\" (UniqueName: \"kubernetes.io/projected/21cc776b-31bc-469a-9b50-930b0480541d-kube-api-access-chgf5\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"21cc776b-31bc-469a-9b50-930b0480541d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:25 crc kubenswrapper[4696]: I0318 16:00:25.926497 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3\") " pod="openstack/ceilometer-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.030143 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.040297 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.174784 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.296342 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-logs\") pod \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.296531 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-combined-ca-bundle\") pod \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.296618 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-config-data\") pod \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " Mar 
18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.296757 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnlnx\" (UniqueName: \"kubernetes.io/projected/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-kube-api-access-vnlnx\") pod \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\" (UID: \"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1\") " Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.297156 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-logs" (OuterVolumeSpecName: "logs") pod "b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" (UID: "b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.297780 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.307865 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-kube-api-access-vnlnx" (OuterVolumeSpecName: "kube-api-access-vnlnx") pod "b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" (UID: "b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1"). InnerVolumeSpecName "kube-api-access-vnlnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.330320 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-config-data" (OuterVolumeSpecName: "config-data") pod "b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" (UID: "b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.344737 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" (UID: "b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.400750 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnlnx\" (UniqueName: \"kubernetes.io/projected/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-kube-api-access-vnlnx\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.400821 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.400835 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.552798 4696 generic.go:334] "Generic (PLEG): container finished" podID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerID="c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7" exitCode=0 Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.553139 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1","Type":"ContainerDied","Data":"c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7"} Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.553174 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1","Type":"ContainerDied","Data":"a20933208e1ebbc1dc08353974c6b239fc5ce6a39c2f4af9843fec448a46668a"} Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.553192 4696 scope.go:117] "RemoveContainer" containerID="c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.553338 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.588764 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.605050 4696 scope.go:117] "RemoveContainer" containerID="3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.612067 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.633256 4696 scope.go:117] "RemoveContainer" containerID="c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.634025 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:26 crc kubenswrapper[4696]: E0318 16:00:26.638070 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7\": container with ID starting with c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7 not found: ID does not exist" containerID="c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.638113 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7"} err="failed to get container status \"c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7\": rpc error: code = NotFound desc = could not find container \"c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7\": container with ID starting with c29646643bf825a4cae0479b9418933c08b989807de843cf870b8b77880da0a7 not found: ID does not exist" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.638141 4696 scope.go:117] "RemoveContainer" containerID="3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5" Mar 18 16:00:26 crc kubenswrapper[4696]: E0318 16:00:26.638588 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5\": container with ID starting with 3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5 not found: ID does not exist" containerID="3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.638658 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5"} err="failed to get container status \"3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5\": rpc error: code = NotFound desc = could not find container \"3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5\": container with ID starting with 3892c73a5179c3262cba4f588ee27ca4b888ae29a3d95e09473f1e5cb36578f5 not found: ID does not exist" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.644612 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:26 crc kubenswrapper[4696]: E0318 16:00:26.645323 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-api" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.645343 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-api" Mar 18 16:00:26 crc kubenswrapper[4696]: E0318 16:00:26.645363 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-log" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.645373 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-log" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.645663 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-log" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.645696 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" containerName="nova-api-api" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.647122 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.653080 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.654641 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.654829 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.656177 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.698891 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.811385 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-config-data\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.811770 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.811813 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-public-tls-certs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc 
kubenswrapper[4696]: I0318 16:00:26.811925 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.812003 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7jm\" (UniqueName: \"kubernetes.io/projected/974761ea-3578-41b3-8e25-6232883d5b4c-kube-api-access-zq7jm\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.812045 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974761ea-3578-41b3-8e25-6232883d5b4c-logs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.914395 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.914461 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7jm\" (UniqueName: \"kubernetes.io/projected/974761ea-3578-41b3-8e25-6232883d5b4c-kube-api-access-zq7jm\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.914493 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/974761ea-3578-41b3-8e25-6232883d5b4c-logs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.914634 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-config-data\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.914704 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.914745 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-public-tls-certs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.915279 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974761ea-3578-41b3-8e25-6232883d5b4c-logs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.920197 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-public-tls-certs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.933096 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.940728 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-config-data\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.941198 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.961827 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7jm\" (UniqueName: \"kubernetes.io/projected/974761ea-3578-41b3-8e25-6232883d5b4c-kube-api-access-zq7jm\") pod \"nova-api-0\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " pod="openstack/nova-api-0" Mar 18 16:00:26 crc kubenswrapper[4696]: I0318 16:00:26.983292 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:27 crc kubenswrapper[4696]: I0318 16:00:27.485556 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:27 crc kubenswrapper[4696]: W0318 16:00:27.494378 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod974761ea_3578_41b3_8e25_6232883d5b4c.slice/crio-88386fa8d5357d991c40a6bfcfffb2da44ff5836c8dec4bc30d505bc4d8fdeb9 WatchSource:0}: Error finding container 88386fa8d5357d991c40a6bfcfffb2da44ff5836c8dec4bc30d505bc4d8fdeb9: Status 404 returned error can't find the container with id 88386fa8d5357d991c40a6bfcfffb2da44ff5836c8dec4bc30d505bc4d8fdeb9 Mar 18 16:00:27 crc kubenswrapper[4696]: I0318 16:00:27.568324 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3","Type":"ContainerStarted","Data":"f0cd19ad57a3a71be799243de281f35280cbe0c003db0789a3f21c3cc4297468"} Mar 18 16:00:27 crc kubenswrapper[4696]: I0318 16:00:27.577588 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"974761ea-3578-41b3-8e25-6232883d5b4c","Type":"ContainerStarted","Data":"88386fa8d5357d991c40a6bfcfffb2da44ff5836c8dec4bc30d505bc4d8fdeb9"} Mar 18 16:00:27 crc kubenswrapper[4696]: I0318 16:00:27.582546 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21cc776b-31bc-469a-9b50-930b0480541d","Type":"ContainerStarted","Data":"aa5d66c4f2843a295fa4fd456b824c70ebba8d4a7d185313abcf695b62c59920"} Mar 18 16:00:27 crc kubenswrapper[4696]: I0318 16:00:27.582609 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21cc776b-31bc-469a-9b50-930b0480541d","Type":"ContainerStarted","Data":"d4c9cfcba1c2b3967ee778bedb063fc1dd0d42de0f2460eb36b66e840308de02"} Mar 18 16:00:27 crc kubenswrapper[4696]: 
I0318 16:00:27.617303 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.617278332 podStartE2EDuration="2.617278332s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:27.600267324 +0000 UTC m=+1470.606441530" watchObservedRunningTime="2026-03-18 16:00:27.617278332 +0000 UTC m=+1470.623452538" Mar 18 16:00:27 crc kubenswrapper[4696]: I0318 16:00:27.634381 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008931c6-5d36-4eea-954c-0f583bde3955" path="/var/lib/kubelet/pods/008931c6-5d36-4eea-954c-0f583bde3955/volumes" Mar 18 16:00:27 crc kubenswrapper[4696]: I0318 16:00:27.635016 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1" path="/var/lib/kubelet/pods/b2567e0b-0b3f-4a3a-8b94-f42c8b275ab1/volumes" Mar 18 16:00:28 crc kubenswrapper[4696]: I0318 16:00:28.596288 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"974761ea-3578-41b3-8e25-6232883d5b4c","Type":"ContainerStarted","Data":"5804bcf171d5dbd8e8eecb1471954055af1bd27a682572f6b10201ef590edeaf"} Mar 18 16:00:28 crc kubenswrapper[4696]: I0318 16:00:28.596781 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"974761ea-3578-41b3-8e25-6232883d5b4c","Type":"ContainerStarted","Data":"2da7af7386e4a229234247a7f16015eb668a9838b92490bc6e373fc51982d5c3"} Mar 18 16:00:28 crc kubenswrapper[4696]: I0318 16:00:28.599313 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3","Type":"ContainerStarted","Data":"f5c660ff6194780ed5480f701cc0d81fbb28a7ba2ca87de93cc27b59741e85af"} Mar 18 16:00:28 crc kubenswrapper[4696]: I0318 16:00:28.645932 4696 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.645904123 podStartE2EDuration="2.645904123s" podCreationTimestamp="2026-03-18 16:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:28.621726264 +0000 UTC m=+1471.627900480" watchObservedRunningTime="2026-03-18 16:00:28.645904123 +0000 UTC m=+1471.652078319" Mar 18 16:00:29 crc kubenswrapper[4696]: I0318 16:00:29.620589 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3","Type":"ContainerStarted","Data":"435a87b270244220d9d5f612fb2805b419dfe15cd77406e42056b6b83bb05922"} Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.017816 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.124269 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btct4"] Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.125248 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-btct4" podUID="04c5761b-8f82-46b1-903c-49c1145516a0" containerName="dnsmasq-dns" containerID="cri-o://a31eb740c98c11bd1dbbe21b974e9398871d2793161b81fef1ff3d52d6c553f9" gracePeriod=10 Mar 18 16:00:30 crc kubenswrapper[4696]: E0318 16:00:30.227838 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04c5761b_8f82_46b1_903c_49c1145516a0.slice/crio-a31eb740c98c11bd1dbbe21b974e9398871d2793161b81fef1ff3d52d6c553f9.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.666809 4696 generic.go:334] "Generic 
(PLEG): container finished" podID="04c5761b-8f82-46b1-903c-49c1145516a0" containerID="a31eb740c98c11bd1dbbe21b974e9398871d2793161b81fef1ff3d52d6c553f9" exitCode=0 Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.666877 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btct4" event={"ID":"04c5761b-8f82-46b1-903c-49c1145516a0","Type":"ContainerDied","Data":"a31eb740c98c11bd1dbbe21b974e9398871d2793161b81fef1ff3d52d6c553f9"} Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.666908 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btct4" event={"ID":"04c5761b-8f82-46b1-903c-49c1145516a0","Type":"ContainerDied","Data":"2680be57c4c9f71ff8799d3427d714811eb46af4ed6889b1254f81e35c6d123e"} Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.666920 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2680be57c4c9f71ff8799d3427d714811eb46af4ed6889b1254f81e35c6d123e" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.668495 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.671389 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3","Type":"ContainerStarted","Data":"ae82f6618216d107ba3978a4c06d6467dbc26ca003e55006860e9ec8e68e7c00"} Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.740568 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-swift-storage-0\") pod \"04c5761b-8f82-46b1-903c-49c1145516a0\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.740755 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drsrd\" (UniqueName: \"kubernetes.io/projected/04c5761b-8f82-46b1-903c-49c1145516a0-kube-api-access-drsrd\") pod \"04c5761b-8f82-46b1-903c-49c1145516a0\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.740802 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-nb\") pod \"04c5761b-8f82-46b1-903c-49c1145516a0\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.740908 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-svc\") pod \"04c5761b-8f82-46b1-903c-49c1145516a0\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.740977 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-sb\") pod \"04c5761b-8f82-46b1-903c-49c1145516a0\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.741073 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-config\") pod \"04c5761b-8f82-46b1-903c-49c1145516a0\" (UID: \"04c5761b-8f82-46b1-903c-49c1145516a0\") " Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.756782 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c5761b-8f82-46b1-903c-49c1145516a0-kube-api-access-drsrd" (OuterVolumeSpecName: "kube-api-access-drsrd") pod "04c5761b-8f82-46b1-903c-49c1145516a0" (UID: "04c5761b-8f82-46b1-903c-49c1145516a0"). InnerVolumeSpecName "kube-api-access-drsrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.808467 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04c5761b-8f82-46b1-903c-49c1145516a0" (UID: "04c5761b-8f82-46b1-903c-49c1145516a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.821184 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04c5761b-8f82-46b1-903c-49c1145516a0" (UID: "04c5761b-8f82-46b1-903c-49c1145516a0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.821247 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-config" (OuterVolumeSpecName: "config") pod "04c5761b-8f82-46b1-903c-49c1145516a0" (UID: "04c5761b-8f82-46b1-903c-49c1145516a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.824944 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04c5761b-8f82-46b1-903c-49c1145516a0" (UID: "04c5761b-8f82-46b1-903c-49c1145516a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.828904 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04c5761b-8f82-46b1-903c-49c1145516a0" (UID: "04c5761b-8f82-46b1-903c-49c1145516a0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.845305 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.845346 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drsrd\" (UniqueName: \"kubernetes.io/projected/04c5761b-8f82-46b1-903c-49c1145516a0-kube-api-access-drsrd\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.845364 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.845383 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.845395 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:30 crc kubenswrapper[4696]: I0318 16:00:30.845407 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04c5761b-8f82-46b1-903c-49c1145516a0-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:31 crc kubenswrapper[4696]: I0318 16:00:31.040568 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:31 crc kubenswrapper[4696]: I0318 16:00:31.687953 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-btct4" Mar 18 16:00:31 crc kubenswrapper[4696]: I0318 16:00:31.735002 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btct4"] Mar 18 16:00:31 crc kubenswrapper[4696]: I0318 16:00:31.746381 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btct4"] Mar 18 16:00:32 crc kubenswrapper[4696]: I0318 16:00:32.700468 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3","Type":"ContainerStarted","Data":"1d191e795163029e0b5e5309bf2ea17c2bd20c954c2e0f992d239eb587e2abde"} Mar 18 16:00:32 crc kubenswrapper[4696]: I0318 16:00:32.700926 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:00:32 crc kubenswrapper[4696]: I0318 16:00:32.726845 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.576580746 podStartE2EDuration="7.726821699s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="2026-03-18 16:00:26.633401408 +0000 UTC m=+1469.639575614" lastFinishedPulling="2026-03-18 16:00:31.783642361 +0000 UTC m=+1474.789816567" observedRunningTime="2026-03-18 16:00:32.723859095 +0000 UTC m=+1475.730033301" watchObservedRunningTime="2026-03-18 16:00:32.726821699 +0000 UTC m=+1475.732995905" Mar 18 16:00:33 crc kubenswrapper[4696]: I0318 16:00:33.615763 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04c5761b-8f82-46b1-903c-49c1145516a0" path="/var/lib/kubelet/pods/04c5761b-8f82-46b1-903c-49c1145516a0/volumes" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.040586 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.068225 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.759775 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.927077 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vtbst"] Mar 18 16:00:36 crc kubenswrapper[4696]: E0318 16:00:36.927598 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04c5761b-8f82-46b1-903c-49c1145516a0" containerName="dnsmasq-dns" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.927620 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="04c5761b-8f82-46b1-903c-49c1145516a0" containerName="dnsmasq-dns" Mar 18 16:00:36 crc kubenswrapper[4696]: E0318 16:00:36.927634 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04c5761b-8f82-46b1-903c-49c1145516a0" containerName="init" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.927640 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="04c5761b-8f82-46b1-903c-49c1145516a0" containerName="init" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.927844 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="04c5761b-8f82-46b1-903c-49c1145516a0" containerName="dnsmasq-dns" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.928623 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.931171 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.931604 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.939409 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vtbst"] Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.984247 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:00:36 crc kubenswrapper[4696]: I0318 16:00:36.984307 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.113359 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5shrq\" (UniqueName: \"kubernetes.io/projected/99d77d81-5c38-41dd-818b-103a8059e1f9-kube-api-access-5shrq\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.113693 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.113776 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-scripts\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.114342 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-config-data\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.216628 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-config-data\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.216726 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5shrq\" (UniqueName: \"kubernetes.io/projected/99d77d81-5c38-41dd-818b-103a8059e1f9-kube-api-access-5shrq\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.216845 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.216865 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-scripts\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.234430 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.234550 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-scripts\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.235442 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-config-data\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.246091 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5shrq\" (UniqueName: \"kubernetes.io/projected/99d77d81-5c38-41dd-818b-103a8059e1f9-kube-api-access-5shrq\") pod \"nova-cell1-cell-mapping-vtbst\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.256075 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.734317 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vtbst"] Mar 18 16:00:37 crc kubenswrapper[4696]: I0318 16:00:37.752125 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vtbst" event={"ID":"99d77d81-5c38-41dd-818b-103a8059e1f9","Type":"ContainerStarted","Data":"1b368d00af780b15bec588cf69ae1c6284abb0c0be6b76d483628a532fa12685"} Mar 18 16:00:38 crc kubenswrapper[4696]: I0318 16:00:38.005771 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:00:38 crc kubenswrapper[4696]: I0318 16:00:38.005796 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:00:38 crc kubenswrapper[4696]: I0318 16:00:38.767950 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vtbst" event={"ID":"99d77d81-5c38-41dd-818b-103a8059e1f9","Type":"ContainerStarted","Data":"5f7dd1236fe8ba1ffb40e9bd23f1c5b0a1957e06c474823e571e1acc33178c70"} Mar 18 16:00:38 crc kubenswrapper[4696]: I0318 16:00:38.796579 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vtbst" podStartSLOduration=2.796555765 podStartE2EDuration="2.796555765s" podCreationTimestamp="2026-03-18 16:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 16:00:38.790199085 +0000 UTC m=+1481.796373311" watchObservedRunningTime="2026-03-18 16:00:38.796555765 +0000 UTC m=+1481.802729991" Mar 18 16:00:43 crc kubenswrapper[4696]: I0318 16:00:43.833990 4696 generic.go:334] "Generic (PLEG): container finished" podID="99d77d81-5c38-41dd-818b-103a8059e1f9" containerID="5f7dd1236fe8ba1ffb40e9bd23f1c5b0a1957e06c474823e571e1acc33178c70" exitCode=0 Mar 18 16:00:43 crc kubenswrapper[4696]: I0318 16:00:43.834076 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vtbst" event={"ID":"99d77d81-5c38-41dd-818b-103a8059e1f9","Type":"ContainerDied","Data":"5f7dd1236fe8ba1ffb40e9bd23f1c5b0a1957e06c474823e571e1acc33178c70"} Mar 18 16:00:44 crc kubenswrapper[4696]: I0318 16:00:44.984082 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:00:44 crc kubenswrapper[4696]: I0318 16:00:44.984399 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.259120 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.299141 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle\") pod \"99d77d81-5c38-41dd-818b-103a8059e1f9\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.299228 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-config-data\") pod \"99d77d81-5c38-41dd-818b-103a8059e1f9\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.299377 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5shrq\" (UniqueName: \"kubernetes.io/projected/99d77d81-5c38-41dd-818b-103a8059e1f9-kube-api-access-5shrq\") pod \"99d77d81-5c38-41dd-818b-103a8059e1f9\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.299442 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-scripts\") pod \"99d77d81-5c38-41dd-818b-103a8059e1f9\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.305649 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d77d81-5c38-41dd-818b-103a8059e1f9-kube-api-access-5shrq" (OuterVolumeSpecName: "kube-api-access-5shrq") pod "99d77d81-5c38-41dd-818b-103a8059e1f9" (UID: "99d77d81-5c38-41dd-818b-103a8059e1f9"). InnerVolumeSpecName "kube-api-access-5shrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.306747 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-scripts" (OuterVolumeSpecName: "scripts") pod "99d77d81-5c38-41dd-818b-103a8059e1f9" (UID: "99d77d81-5c38-41dd-818b-103a8059e1f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:45 crc kubenswrapper[4696]: E0318 16:00:45.331363 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle podName:99d77d81-5c38-41dd-818b-103a8059e1f9 nodeName:}" failed. No retries permitted until 2026-03-18 16:00:45.831327639 +0000 UTC m=+1488.837501845 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle") pod "99d77d81-5c38-41dd-818b-103a8059e1f9" (UID: "99d77d81-5c38-41dd-818b-103a8059e1f9") : error deleting /var/lib/kubelet/pods/99d77d81-5c38-41dd-818b-103a8059e1f9/volume-subpaths: remove /var/lib/kubelet/pods/99d77d81-5c38-41dd-818b-103a8059e1f9/volume-subpaths: no such file or directory Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.334779 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-config-data" (OuterVolumeSpecName: "config-data") pod "99d77d81-5c38-41dd-818b-103a8059e1f9" (UID: "99d77d81-5c38-41dd-818b-103a8059e1f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.402319 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.402377 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.402394 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5shrq\" (UniqueName: \"kubernetes.io/projected/99d77d81-5c38-41dd-818b-103a8059e1f9-kube-api-access-5shrq\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.857333 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vtbst" event={"ID":"99d77d81-5c38-41dd-818b-103a8059e1f9","Type":"ContainerDied","Data":"1b368d00af780b15bec588cf69ae1c6284abb0c0be6b76d483628a532fa12685"} Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.857380 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b368d00af780b15bec588cf69ae1c6284abb0c0be6b76d483628a532fa12685" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.857622 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vtbst" Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.911786 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle\") pod \"99d77d81-5c38-41dd-818b-103a8059e1f9\" (UID: \"99d77d81-5c38-41dd-818b-103a8059e1f9\") " Mar 18 16:00:45 crc kubenswrapper[4696]: I0318 16:00:45.916320 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99d77d81-5c38-41dd-818b-103a8059e1f9" (UID: "99d77d81-5c38-41dd-818b-103a8059e1f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.015818 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d77d81-5c38-41dd-818b-103a8059e1f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.062602 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.063025 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-log" containerID="cri-o://2da7af7386e4a229234247a7f16015eb668a9838b92490bc6e373fc51982d5c3" gracePeriod=30 Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.063159 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-api" containerID="cri-o://5804bcf171d5dbd8e8eecb1471954055af1bd27a682572f6b10201ef590edeaf" gracePeriod=30 Mar 18 16:00:46 
crc kubenswrapper[4696]: I0318 16:00:46.071676 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.071964 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="30a10886-71b7-48b8-af79-859f4c134f28" containerName="nova-scheduler-scheduler" containerID="cri-o://4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0" gracePeriod=30 Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.108098 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.109029 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-log" containerID="cri-o://583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93" gracePeriod=30 Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.109171 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-metadata" containerID="cri-o://9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2" gracePeriod=30 Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.872705 4696 generic.go:334] "Generic (PLEG): container finished" podID="899e251c-2b43-4c26-a918-82e6444d9d21" containerID="583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93" exitCode=143 Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.873090 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899e251c-2b43-4c26-a918-82e6444d9d21","Type":"ContainerDied","Data":"583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93"} Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.875334 4696 generic.go:334] 
"Generic (PLEG): container finished" podID="974761ea-3578-41b3-8e25-6232883d5b4c" containerID="2da7af7386e4a229234247a7f16015eb668a9838b92490bc6e373fc51982d5c3" exitCode=143 Mar 18 16:00:46 crc kubenswrapper[4696]: I0318 16:00:46.875370 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"974761ea-3578-41b3-8e25-6232883d5b4c","Type":"ContainerDied","Data":"2da7af7386e4a229234247a7f16015eb668a9838b92490bc6e373fc51982d5c3"} Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.819108 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.896986 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-combined-ca-bundle\") pod \"899e251c-2b43-4c26-a918-82e6444d9d21\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.897113 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-nova-metadata-tls-certs\") pod \"899e251c-2b43-4c26-a918-82e6444d9d21\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.897330 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x77vs\" (UniqueName: \"kubernetes.io/projected/899e251c-2b43-4c26-a918-82e6444d9d21-kube-api-access-x77vs\") pod \"899e251c-2b43-4c26-a918-82e6444d9d21\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.897366 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899e251c-2b43-4c26-a918-82e6444d9d21-logs\") pod 
\"899e251c-2b43-4c26-a918-82e6444d9d21\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.897490 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-config-data\") pod \"899e251c-2b43-4c26-a918-82e6444d9d21\" (UID: \"899e251c-2b43-4c26-a918-82e6444d9d21\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.898281 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899e251c-2b43-4c26-a918-82e6444d9d21-logs" (OuterVolumeSpecName: "logs") pod "899e251c-2b43-4c26-a918-82e6444d9d21" (UID: "899e251c-2b43-4c26-a918-82e6444d9d21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.904091 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899e251c-2b43-4c26-a918-82e6444d9d21-kube-api-access-x77vs" (OuterVolumeSpecName: "kube-api-access-x77vs") pod "899e251c-2b43-4c26-a918-82e6444d9d21" (UID: "899e251c-2b43-4c26-a918-82e6444d9d21"). InnerVolumeSpecName "kube-api-access-x77vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.930211 4696 generic.go:334] "Generic (PLEG): container finished" podID="899e251c-2b43-4c26-a918-82e6444d9d21" containerID="9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2" exitCode=0 Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.930301 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.930321 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899e251c-2b43-4c26-a918-82e6444d9d21","Type":"ContainerDied","Data":"9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2"} Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.930357 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899e251c-2b43-4c26-a918-82e6444d9d21","Type":"ContainerDied","Data":"68d2b6a6d953017771001e7a9399f9236f31eabbe3976d23c8ed56f869f81346"} Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.930378 4696 scope.go:117] "RemoveContainer" containerID="9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.932299 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-config-data" (OuterVolumeSpecName: "config-data") pod "899e251c-2b43-4c26-a918-82e6444d9d21" (UID: "899e251c-2b43-4c26-a918-82e6444d9d21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.935763 4696 generic.go:334] "Generic (PLEG): container finished" podID="974761ea-3578-41b3-8e25-6232883d5b4c" containerID="5804bcf171d5dbd8e8eecb1471954055af1bd27a682572f6b10201ef590edeaf" exitCode=0 Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.935816 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"974761ea-3578-41b3-8e25-6232883d5b4c","Type":"ContainerDied","Data":"5804bcf171d5dbd8e8eecb1471954055af1bd27a682572f6b10201ef590edeaf"} Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.935931 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "899e251c-2b43-4c26-a918-82e6444d9d21" (UID: "899e251c-2b43-4c26-a918-82e6444d9d21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.986598 4696 scope.go:117] "RemoveContainer" containerID="583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.992081 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "899e251c-2b43-4c26-a918-82e6444d9d21" (UID: "899e251c-2b43-4c26-a918-82e6444d9d21"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.996203 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.999564 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-internal-tls-certs\") pod \"974761ea-3578-41b3-8e25-6232883d5b4c\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.999604 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq7jm\" (UniqueName: \"kubernetes.io/projected/974761ea-3578-41b3-8e25-6232883d5b4c-kube-api-access-zq7jm\") pod \"974761ea-3578-41b3-8e25-6232883d5b4c\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.999629 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-combined-ca-bundle\") pod \"974761ea-3578-41b3-8e25-6232883d5b4c\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.999756 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-public-tls-certs\") pod \"974761ea-3578-41b3-8e25-6232883d5b4c\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.999809 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-config-data\") pod \"974761ea-3578-41b3-8e25-6232883d5b4c\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " Mar 18 16:00:49 crc kubenswrapper[4696]: I0318 16:00:49.999914 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/974761ea-3578-41b3-8e25-6232883d5b4c-logs\") pod \"974761ea-3578-41b3-8e25-6232883d5b4c\" (UID: \"974761ea-3578-41b3-8e25-6232883d5b4c\") " Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.000717 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.000745 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.000760 4696 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899e251c-2b43-4c26-a918-82e6444d9d21-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.000772 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x77vs\" (UniqueName: \"kubernetes.io/projected/899e251c-2b43-4c26-a918-82e6444d9d21-kube-api-access-x77vs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.000781 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899e251c-2b43-4c26-a918-82e6444d9d21-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.001354 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/974761ea-3578-41b3-8e25-6232883d5b4c-logs" (OuterVolumeSpecName: "logs") pod "974761ea-3578-41b3-8e25-6232883d5b4c" (UID: "974761ea-3578-41b3-8e25-6232883d5b4c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.015776 4696 scope.go:117] "RemoveContainer" containerID="9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.015862 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974761ea-3578-41b3-8e25-6232883d5b4c-kube-api-access-zq7jm" (OuterVolumeSpecName: "kube-api-access-zq7jm") pod "974761ea-3578-41b3-8e25-6232883d5b4c" (UID: "974761ea-3578-41b3-8e25-6232883d5b4c"). InnerVolumeSpecName "kube-api-access-zq7jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.020383 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2\": container with ID starting with 9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2 not found: ID does not exist" containerID="9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.020444 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2"} err="failed to get container status \"9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2\": rpc error: code = NotFound desc = could not find container \"9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2\": container with ID starting with 9283a323074cee9e3e1d05138f65030d145f0263eccdf15120558340835a18b2 not found: ID does not exist" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.020477 4696 scope.go:117] "RemoveContainer" containerID="583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93" Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.028279 
4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93\": container with ID starting with 583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93 not found: ID does not exist" containerID="583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.030181 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93"} err="failed to get container status \"583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93\": rpc error: code = NotFound desc = could not find container \"583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93\": container with ID starting with 583e2d8265ba730d41af1afcd90ca447224b63958b302741a9bfe84f3e92cd93 not found: ID does not exist" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.037115 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "974761ea-3578-41b3-8e25-6232883d5b4c" (UID: "974761ea-3578-41b3-8e25-6232883d5b4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.058023 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-config-data" (OuterVolumeSpecName: "config-data") pod "974761ea-3578-41b3-8e25-6232883d5b4c" (UID: "974761ea-3578-41b3-8e25-6232883d5b4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.080741 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "974761ea-3578-41b3-8e25-6232883d5b4c" (UID: "974761ea-3578-41b3-8e25-6232883d5b4c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.091097 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "974761ea-3578-41b3-8e25-6232883d5b4c" (UID: "974761ea-3578-41b3-8e25-6232883d5b4c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.102862 4696 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.102901 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.102913 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974761ea-3578-41b3-8e25-6232883d5b4c-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.102922 4696 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 
16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.102933 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974761ea-3578-41b3-8e25-6232883d5b4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.102944 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq7jm\" (UniqueName: \"kubernetes.io/projected/974761ea-3578-41b3-8e25-6232883d5b4c-kube-api-access-zq7jm\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.261252 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.268936 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.291866 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.292494 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-api" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.292586 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-api" Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.292621 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-metadata" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.292630 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-metadata" Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.292652 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d77d81-5c38-41dd-818b-103a8059e1f9" 
containerName="nova-manage" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.292660 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d77d81-5c38-41dd-818b-103a8059e1f9" containerName="nova-manage" Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.292672 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-log" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.292679 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-log" Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.292700 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-log" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.292707 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-log" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.293000 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-api" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.293020 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" containerName="nova-api-log" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.293038 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-metadata" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.293047 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" containerName="nova-metadata-log" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.293062 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d77d81-5c38-41dd-818b-103a8059e1f9" 
containerName="nova-manage" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.294466 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.296828 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.296999 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.305045 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.307175 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.307234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.307330 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c03e61df-341f-42de-8682-c17255ffedcb-logs\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.307387 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-bdxd4\" (UniqueName: \"kubernetes.io/projected/c03e61df-341f-42de-8682-c17255ffedcb-kube-api-access-bdxd4\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.307516 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-config-data\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.408436 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-config-data\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.408513 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.408547 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.408612 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c03e61df-341f-42de-8682-c17255ffedcb-logs\") pod \"nova-metadata-0\" (UID: 
\"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.408655 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxd4\" (UniqueName: \"kubernetes.io/projected/c03e61df-341f-42de-8682-c17255ffedcb-kube-api-access-bdxd4\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.409174 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c03e61df-341f-42de-8682-c17255ffedcb-logs\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.412781 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.412869 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-config-data\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.414260 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c03e61df-341f-42de-8682-c17255ffedcb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.428264 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bdxd4\" (UniqueName: \"kubernetes.io/projected/c03e61df-341f-42de-8682-c17255ffedcb-kube-api-access-bdxd4\") pod \"nova-metadata-0\" (UID: \"c03e61df-341f-42de-8682-c17255ffedcb\") " pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.621710 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.623566 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.625169 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:00:50 crc kubenswrapper[4696]: E0318 16:00:50.625214 4696 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="30a10886-71b7-48b8-af79-859f4c134f28" containerName="nova-scheduler-scheduler" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.707693 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.953672 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"974761ea-3578-41b3-8e25-6232883d5b4c","Type":"ContainerDied","Data":"88386fa8d5357d991c40a6bfcfffb2da44ff5836c8dec4bc30d505bc4d8fdeb9"} Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.953938 4696 scope.go:117] "RemoveContainer" containerID="5804bcf171d5dbd8e8eecb1471954055af1bd27a682572f6b10201ef590edeaf" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.954088 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:50 crc kubenswrapper[4696]: I0318 16:00:50.989370 4696 scope.go:117] "RemoveContainer" containerID="2da7af7386e4a229234247a7f16015eb668a9838b92490bc6e373fc51982d5c3" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.044379 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.071531 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.083408 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.085213 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.088965 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.089301 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.094138 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.101251 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.124775 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/449c7dea-20e2-4b99-bec6-e3287082418a-logs\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.124831 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.124858 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.124925 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4dbc\" 
(UniqueName: \"kubernetes.io/projected/449c7dea-20e2-4b99-bec6-e3287082418a-kube-api-access-w4dbc\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.125411 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-public-tls-certs\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.125441 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-config-data\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: W0318 16:00:51.203065 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc03e61df_341f_42de_8682_c17255ffedcb.slice/crio-e1f087d01bc1aef93b7d398a17745ffb3bec0dcfca637732a0b3c5eddf6232e7 WatchSource:0}: Error finding container e1f087d01bc1aef93b7d398a17745ffb3bec0dcfca637732a0b3c5eddf6232e7: Status 404 returned error can't find the container with id e1f087d01bc1aef93b7d398a17745ffb3bec0dcfca637732a0b3c5eddf6232e7 Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.210210 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.228076 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-public-tls-certs\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc 
kubenswrapper[4696]: I0318 16:00:51.228134 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-config-data\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.228186 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/449c7dea-20e2-4b99-bec6-e3287082418a-logs\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.228208 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.228234 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.228325 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4dbc\" (UniqueName: \"kubernetes.io/projected/449c7dea-20e2-4b99-bec6-e3287082418a-kube-api-access-w4dbc\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.229396 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/449c7dea-20e2-4b99-bec6-e3287082418a-logs\") pod \"nova-api-0\" (UID: 
\"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.235751 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-public-tls-certs\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.236102 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-config-data\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.237395 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.238239 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c7dea-20e2-4b99-bec6-e3287082418a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.248653 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4dbc\" (UniqueName: \"kubernetes.io/projected/449c7dea-20e2-4b99-bec6-e3287082418a-kube-api-access-w4dbc\") pod \"nova-api-0\" (UID: \"449c7dea-20e2-4b99-bec6-e3287082418a\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.402916 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.611051 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899e251c-2b43-4c26-a918-82e6444d9d21" path="/var/lib/kubelet/pods/899e251c-2b43-4c26-a918-82e6444d9d21/volumes" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.617016 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974761ea-3578-41b3-8e25-6232883d5b4c" path="/var/lib/kubelet/pods/974761ea-3578-41b3-8e25-6232883d5b4c/volumes" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.775869 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.852249 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-config-data\") pod \"30a10886-71b7-48b8-af79-859f4c134f28\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.852365 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdlmp\" (UniqueName: \"kubernetes.io/projected/30a10886-71b7-48b8-af79-859f4c134f28-kube-api-access-sdlmp\") pod \"30a10886-71b7-48b8-af79-859f4c134f28\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.853222 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-combined-ca-bundle\") pod \"30a10886-71b7-48b8-af79-859f4c134f28\" (UID: \"30a10886-71b7-48b8-af79-859f4c134f28\") " Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.858505 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/30a10886-71b7-48b8-af79-859f4c134f28-kube-api-access-sdlmp" (OuterVolumeSpecName: "kube-api-access-sdlmp") pod "30a10886-71b7-48b8-af79-859f4c134f28" (UID: "30a10886-71b7-48b8-af79-859f4c134f28"). InnerVolumeSpecName "kube-api-access-sdlmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.889159 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-config-data" (OuterVolumeSpecName: "config-data") pod "30a10886-71b7-48b8-af79-859f4c134f28" (UID: "30a10886-71b7-48b8-af79-859f4c134f28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.893201 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30a10886-71b7-48b8-af79-859f4c134f28" (UID: "30a10886-71b7-48b8-af79-859f4c134f28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.939820 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.960179 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.960221 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdlmp\" (UniqueName: \"kubernetes.io/projected/30a10886-71b7-48b8-af79-859f4c134f28-kube-api-access-sdlmp\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.960237 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a10886-71b7-48b8-af79-859f4c134f28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.982670 4696 generic.go:334] "Generic (PLEG): container finished" podID="30a10886-71b7-48b8-af79-859f4c134f28" containerID="4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0" exitCode=0 Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.982806 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.982936 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30a10886-71b7-48b8-af79-859f4c134f28","Type":"ContainerDied","Data":"4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0"} Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.982995 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30a10886-71b7-48b8-af79-859f4c134f28","Type":"ContainerDied","Data":"81814b15831039d10d39091be6c53a085189aae74daf95ac1cd21d2a16886774"} Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.983017 4696 scope.go:117] "RemoveContainer" containerID="4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0" Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.993015 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c03e61df-341f-42de-8682-c17255ffedcb","Type":"ContainerStarted","Data":"b9f5ee73628db1c5f8c195b4d3bfc827c01805c86d9839215eb1f52958a850af"} Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.993075 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c03e61df-341f-42de-8682-c17255ffedcb","Type":"ContainerStarted","Data":"09982d934f26ecb7a43edcb84e8d03ae167f63f2a836ab3dcbfe3acac38e4d6f"} Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.993094 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c03e61df-341f-42de-8682-c17255ffedcb","Type":"ContainerStarted","Data":"e1f087d01bc1aef93b7d398a17745ffb3bec0dcfca637732a0b3c5eddf6232e7"} Mar 18 16:00:51 crc kubenswrapper[4696]: I0318 16:00:51.997693 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"449c7dea-20e2-4b99-bec6-e3287082418a","Type":"ContainerStarted","Data":"a3b0f1dc61287a41c223f24124a7992021beb3104dddd116f7ca604755887585"} Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.016089 4696 scope.go:117] "RemoveContainer" containerID="4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0" Mar 18 16:00:52 crc kubenswrapper[4696]: E0318 16:00:52.018410 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0\": container with ID starting with 4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0 not found: ID does not exist" containerID="4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.018540 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0"} err="failed to get container status \"4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0\": rpc error: code = NotFound desc = could not find container \"4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0\": container with ID starting with 4c2f74c9956957e3d1950d932caab7d355c51e0b051c1e1863670372e3daf7c0 not found: ID does not exist" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.048956 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.048926007 podStartE2EDuration="2.048926007s" podCreationTimestamp="2026-03-18 16:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:52.016814398 +0000 UTC m=+1495.022988624" watchObservedRunningTime="2026-03-18 16:00:52.048926007 +0000 UTC m=+1495.055100213" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 
16:00:52.071450 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.093457 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.102558 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:52 crc kubenswrapper[4696]: E0318 16:00:52.103171 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a10886-71b7-48b8-af79-859f4c134f28" containerName="nova-scheduler-scheduler" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.103197 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a10886-71b7-48b8-af79-859f4c134f28" containerName="nova-scheduler-scheduler" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.103494 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a10886-71b7-48b8-af79-859f4c134f28" containerName="nova-scheduler-scheduler" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.104248 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.107851 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.114314 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.163736 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.163885 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-config-data\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.163941 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhsq\" (UniqueName: \"kubernetes.io/projected/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-kube-api-access-5hhsq\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.266025 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.266190 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-config-data\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.266252 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhsq\" (UniqueName: \"kubernetes.io/projected/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-kube-api-access-5hhsq\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.271060 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-config-data\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.271240 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.284294 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hhsq\" (UniqueName: \"kubernetes.io/projected/f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a-kube-api-access-5hhsq\") pod \"nova-scheduler-0\" (UID: \"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.427361 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4696]: I0318 16:00:52.934836 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:53 crc kubenswrapper[4696]: I0318 16:00:53.010785 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a","Type":"ContainerStarted","Data":"59c9d82dbb5e2f142541b6e70de506fe6d0300c37d379ae6bc0055cb3057ed74"} Mar 18 16:00:53 crc kubenswrapper[4696]: I0318 16:00:53.014481 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"449c7dea-20e2-4b99-bec6-e3287082418a","Type":"ContainerStarted","Data":"427d710c2235b34cdc08b504711d17844d74f35408ffd9d62439afddef1017e0"} Mar 18 16:00:53 crc kubenswrapper[4696]: I0318 16:00:53.014571 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"449c7dea-20e2-4b99-bec6-e3287082418a","Type":"ContainerStarted","Data":"7c3432c59e41cd64783796639b304102ec74b8d5b21102404a708ef43160bf8f"} Mar 18 16:00:53 crc kubenswrapper[4696]: I0318 16:00:53.059872 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.059831322 podStartE2EDuration="2.059831322s" podCreationTimestamp="2026-03-18 16:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:53.041816358 +0000 UTC m=+1496.047990594" watchObservedRunningTime="2026-03-18 16:00:53.059831322 +0000 UTC m=+1496.066005568" Mar 18 16:00:53 crc kubenswrapper[4696]: I0318 16:00:53.623372 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a10886-71b7-48b8-af79-859f4c134f28" path="/var/lib/kubelet/pods/30a10886-71b7-48b8-af79-859f4c134f28/volumes" Mar 18 16:00:54 crc kubenswrapper[4696]: I0318 16:00:54.027935 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a","Type":"ContainerStarted","Data":"2f850826f120aedba1061e111f9e305b44abd00b39ab746b26cf15f9fc3af7c7"} Mar 18 16:00:54 crc kubenswrapper[4696]: I0318 16:00:54.066785 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.066748375 podStartE2EDuration="2.066748375s" podCreationTimestamp="2026-03-18 16:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:54.046629138 +0000 UTC m=+1497.052803364" watchObservedRunningTime="2026-03-18 16:00:54.066748375 +0000 UTC m=+1497.072922611" Mar 18 16:00:56 crc kubenswrapper[4696]: I0318 16:00:56.044423 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 16:00:57 crc kubenswrapper[4696]: I0318 16:00:57.428138 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.135758 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29564161-mrc5s"] Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.137989 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.158860 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564161-mrc5s"] Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.231437 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbn66\" (UniqueName: \"kubernetes.io/projected/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-kube-api-access-pbn66\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.231502 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-combined-ca-bundle\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.231586 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-fernet-keys\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.231607 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-config-data\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.333744 4696 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-pbn66\" (UniqueName: \"kubernetes.io/projected/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-kube-api-access-pbn66\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.333797 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-combined-ca-bundle\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.333884 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-fernet-keys\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.333907 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-config-data\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.341647 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-fernet-keys\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.341730 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-config-data\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.346198 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-combined-ca-bundle\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.349696 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbn66\" (UniqueName: \"kubernetes.io/projected/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-kube-api-access-pbn66\") pod \"keystone-cron-29564161-mrc5s\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.469704 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.710848 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.711230 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:01:00 crc kubenswrapper[4696]: I0318 16:01:00.975167 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564161-mrc5s"] Mar 18 16:01:00 crc kubenswrapper[4696]: W0318 16:01:00.976948 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8aacc65_eb04_4cb3_8ab2_fb34b6769db4.slice/crio-4a229387027c5ca4d4079f095376582f411767f02bdbb3a2b7a23a95e868171d WatchSource:0}: Error finding container 4a229387027c5ca4d4079f095376582f411767f02bdbb3a2b7a23a95e868171d: Status 404 returned error can't find the container with id 4a229387027c5ca4d4079f095376582f411767f02bdbb3a2b7a23a95e868171d Mar 18 16:01:01 crc kubenswrapper[4696]: I0318 16:01:01.103953 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-mrc5s" event={"ID":"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4","Type":"ContainerStarted","Data":"4a229387027c5ca4d4079f095376582f411767f02bdbb3a2b7a23a95e868171d"} Mar 18 16:01:01 crc kubenswrapper[4696]: I0318 16:01:01.403596 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:01:01 crc kubenswrapper[4696]: I0318 16:01:01.405150 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:01:01 crc kubenswrapper[4696]: I0318 16:01:01.736021 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c03e61df-341f-42de-8682-c17255ffedcb" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"https://10.217.0.212:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:01 crc kubenswrapper[4696]: I0318 16:01:01.736798 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c03e61df-341f-42de-8682-c17255ffedcb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:02 crc kubenswrapper[4696]: I0318 16:01:02.115661 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-mrc5s" event={"ID":"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4","Type":"ContainerStarted","Data":"a1b0336770064193850d5ba6d921f92fa1d0ec40f65de0163e6fa24871b66dc7"} Mar 18 16:01:02 crc kubenswrapper[4696]: I0318 16:01:02.419718 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="449c7dea-20e2-4b99-bec6-e3287082418a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:02 crc kubenswrapper[4696]: I0318 16:01:02.419828 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="449c7dea-20e2-4b99-bec6-e3287082418a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:02 crc kubenswrapper[4696]: I0318 16:01:02.427892 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 16:01:02 crc kubenswrapper[4696]: I0318 16:01:02.463014 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 16:01:02 crc kubenswrapper[4696]: I0318 16:01:02.491325 4696 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29564161-mrc5s" podStartSLOduration=2.491303314 podStartE2EDuration="2.491303314s" podCreationTimestamp="2026-03-18 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:02.136828639 +0000 UTC m=+1505.143002845" watchObservedRunningTime="2026-03-18 16:01:02.491303314 +0000 UTC m=+1505.497477520" Mar 18 16:01:03 crc kubenswrapper[4696]: I0318 16:01:03.171142 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 16:01:04 crc kubenswrapper[4696]: I0318 16:01:04.140980 4696 generic.go:334] "Generic (PLEG): container finished" podID="e8aacc65-eb04-4cb3-8ab2-fb34b6769db4" containerID="a1b0336770064193850d5ba6d921f92fa1d0ec40f65de0163e6fa24871b66dc7" exitCode=0 Mar 18 16:01:04 crc kubenswrapper[4696]: I0318 16:01:04.141070 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-mrc5s" event={"ID":"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4","Type":"ContainerDied","Data":"a1b0336770064193850d5ba6d921f92fa1d0ec40f65de0163e6fa24871b66dc7"} Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.534951 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.563100 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbn66\" (UniqueName: \"kubernetes.io/projected/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-kube-api-access-pbn66\") pod \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.563307 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-fernet-keys\") pod \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.563498 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-combined-ca-bundle\") pod \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.563593 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-config-data\") pod \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\" (UID: \"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4\") " Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.569702 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-kube-api-access-pbn66" (OuterVolumeSpecName: "kube-api-access-pbn66") pod "e8aacc65-eb04-4cb3-8ab2-fb34b6769db4" (UID: "e8aacc65-eb04-4cb3-8ab2-fb34b6769db4"). InnerVolumeSpecName "kube-api-access-pbn66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.571947 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e8aacc65-eb04-4cb3-8ab2-fb34b6769db4" (UID: "e8aacc65-eb04-4cb3-8ab2-fb34b6769db4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.593876 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8aacc65-eb04-4cb3-8ab2-fb34b6769db4" (UID: "e8aacc65-eb04-4cb3-8ab2-fb34b6769db4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.630139 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-config-data" (OuterVolumeSpecName: "config-data") pod "e8aacc65-eb04-4cb3-8ab2-fb34b6769db4" (UID: "e8aacc65-eb04-4cb3-8ab2-fb34b6769db4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.666294 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.666342 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.666351 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbn66\" (UniqueName: \"kubernetes.io/projected/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-kube-api-access-pbn66\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:05 crc kubenswrapper[4696]: I0318 16:01:05.666476 4696 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8aacc65-eb04-4cb3-8ab2-fb34b6769db4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:06 crc kubenswrapper[4696]: I0318 16:01:06.166160 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-mrc5s" event={"ID":"e8aacc65-eb04-4cb3-8ab2-fb34b6769db4","Type":"ContainerDied","Data":"4a229387027c5ca4d4079f095376582f411767f02bdbb3a2b7a23a95e868171d"} Mar 18 16:01:06 crc kubenswrapper[4696]: I0318 16:01:06.166220 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a229387027c5ca4d4079f095376582f411767f02bdbb3a2b7a23a95e868171d" Mar 18 16:01:06 crc kubenswrapper[4696]: I0318 16:01:06.166226 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564161-mrc5s" Mar 18 16:01:08 crc kubenswrapper[4696]: I0318 16:01:08.708306 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:01:08 crc kubenswrapper[4696]: I0318 16:01:08.708671 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:01:09 crc kubenswrapper[4696]: I0318 16:01:09.403675 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:01:09 crc kubenswrapper[4696]: I0318 16:01:09.403993 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:01:10 crc kubenswrapper[4696]: I0318 16:01:10.714427 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:01:10 crc kubenswrapper[4696]: I0318 16:01:10.715157 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:01:10 crc kubenswrapper[4696]: I0318 16:01:10.719605 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:01:11 crc kubenswrapper[4696]: I0318 16:01:11.234590 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:01:11 crc kubenswrapper[4696]: I0318 16:01:11.412757 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:01:11 crc kubenswrapper[4696]: I0318 16:01:11.416569 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:01:11 crc kubenswrapper[4696]: I0318 16:01:11.418932 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:01:12 crc kubenswrapper[4696]: I0318 16:01:12.238023 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:01:19 crc kubenswrapper[4696]: I0318 16:01:19.742329 4696 scope.go:117] "RemoveContainer" containerID="90ffc7e8b38bf04d741b4bb6b29985262ec4a42b5698b0657694bb0b2467c041" Mar 18 16:01:20 crc kubenswrapper[4696]: I0318 16:01:20.683855 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:01:21 crc kubenswrapper[4696]: I0318 16:01:21.620870 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:01:24 crc kubenswrapper[4696]: I0318 16:01:24.991496 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="9b5b880f-8efc-483f-b734-fa854ddd30dc" containerName="rabbitmq" containerID="cri-o://763a47e190b3766069fc26ef6310bdc5c5beb355c682d3dd8ee7ed57e35b1c03" gracePeriod=604796 Mar 18 16:01:26 crc kubenswrapper[4696]: I0318 16:01:26.420012 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerName="rabbitmq" containerID="cri-o://12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b" gracePeriod=604796 Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.445022 4696 generic.go:334] "Generic (PLEG): container finished" podID="9b5b880f-8efc-483f-b734-fa854ddd30dc" containerID="763a47e190b3766069fc26ef6310bdc5c5beb355c682d3dd8ee7ed57e35b1c03" exitCode=0 Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.445158 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b5b880f-8efc-483f-b734-fa854ddd30dc","Type":"ContainerDied","Data":"763a47e190b3766069fc26ef6310bdc5c5beb355c682d3dd8ee7ed57e35b1c03"} Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.604188 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.746212 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b5b880f-8efc-483f-b734-fa854ddd30dc-pod-info\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.746653 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b5b880f-8efc-483f-b734-fa854ddd30dc-erlang-cookie-secret\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.746754 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-server-conf\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.746826 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-erlang-cookie\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.746894 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-plugins-conf\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.747012 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.747076 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-config-data\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.747155 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-tls\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.747261 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-plugins\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.747360 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-confd\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.747445 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnjnk\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-kube-api-access-jnjnk\") pod \"9b5b880f-8efc-483f-b734-fa854ddd30dc\" (UID: \"9b5b880f-8efc-483f-b734-fa854ddd30dc\") " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 
16:01:31.747508 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.747919 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.749760 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.755379 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5b880f-8efc-483f-b734-fa854ddd30dc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.757610 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.758457 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.766954 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9b5b880f-8efc-483f-b734-fa854ddd30dc-pod-info" (OuterVolumeSpecName: "pod-info") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.768510 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-kube-api-access-jnjnk" (OuterVolumeSpecName: "kube-api-access-jnjnk") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "kube-api-access-jnjnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.774597 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.805405 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-config-data" (OuterVolumeSpecName: "config-data") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.833011 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-server-conf" (OuterVolumeSpecName: "server-conf") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.851659 4696 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b5b880f-8efc-483f-b734-fa854ddd30dc-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.851706 4696 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b5b880f-8efc-483f-b734-fa854ddd30dc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.851720 4696 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.851730 4696 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 
crc kubenswrapper[4696]: I0318 16:01:31.851768 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.851780 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b5b880f-8efc-483f-b734-fa854ddd30dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.851791 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.851801 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.851813 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnjnk\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-kube-api-access-jnjnk\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.879976 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9b5b880f-8efc-483f-b734-fa854ddd30dc" (UID: "9b5b880f-8efc-483f-b734-fa854ddd30dc"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.883150 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.954305 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4696]: I0318 16:01:31.954348 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b5b880f-8efc-483f-b734-fa854ddd30dc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.010357 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.463323 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b5b880f-8efc-483f-b734-fa854ddd30dc","Type":"ContainerDied","Data":"619ff68b53517da3000215b05bbd62bf092feb025f5ae29d9886854eeb792fad"} Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.463679 4696 scope.go:117] "RemoveContainer" containerID="763a47e190b3766069fc26ef6310bdc5c5beb355c682d3dd8ee7ed57e35b1c03" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.463424 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.514262 4696 scope.go:117] "RemoveContainer" containerID="c0fd7f944f641aa39a971d00f949ab3c23bfbfb18ecce7eea052a2f01e079a00" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.520584 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.530129 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.580633 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:01:32 crc kubenswrapper[4696]: E0318 16:01:32.581314 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5b880f-8efc-483f-b734-fa854ddd30dc" containerName="setup-container" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.581332 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5b880f-8efc-483f-b734-fa854ddd30dc" containerName="setup-container" Mar 18 16:01:32 crc kubenswrapper[4696]: E0318 16:01:32.581350 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5b880f-8efc-483f-b734-fa854ddd30dc" containerName="rabbitmq" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.581357 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5b880f-8efc-483f-b734-fa854ddd30dc" containerName="rabbitmq" Mar 18 16:01:32 crc kubenswrapper[4696]: E0318 16:01:32.581371 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8aacc65-eb04-4cb3-8ab2-fb34b6769db4" containerName="keystone-cron" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.581379 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8aacc65-eb04-4cb3-8ab2-fb34b6769db4" containerName="keystone-cron" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.581885 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e8aacc65-eb04-4cb3-8ab2-fb34b6769db4" containerName="keystone-cron" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.581925 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5b880f-8efc-483f-b734-fa854ddd30dc" containerName="rabbitmq" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.583220 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.586806 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.587000 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.587126 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.587236 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ks66j" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.587009 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.587353 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.587419 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.599041 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.764594 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.764712 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-config-data\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.764734 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8db68e71-4312-400b-8575-06f87bf6a781-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.764778 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.764799 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.769669 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8db68e71-4312-400b-8575-06f87bf6a781-pod-info\") pod \"rabbitmq-server-0\" 
(UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.769754 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.769878 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.769930 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.769987 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.770051 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrwz\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-kube-api-access-vfrwz\") pod \"rabbitmq-server-0\" (UID: 
\"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.871897 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.872237 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.886954 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-config-data\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.887044 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8db68e71-4312-400b-8575-06f87bf6a781-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.887092 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.887108 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.887262 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8db68e71-4312-400b-8575-06f87bf6a781-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.887699 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.887985 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-config-data\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.887961 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.888374 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.888636 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.889044 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrwz\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-kube-api-access-vfrwz\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.891915 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.892168 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.892242 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8db68e71-4312-400b-8575-06f87bf6a781-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.892384 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.892881 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.894004 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8db68e71-4312-400b-8575-06f87bf6a781-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.894644 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8db68e71-4312-400b-8575-06f87bf6a781-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.905217 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.914232 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vfrwz\" (UniqueName: \"kubernetes.io/projected/8db68e71-4312-400b-8575-06f87bf6a781-kube-api-access-vfrwz\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:32 crc kubenswrapper[4696]: I0318 16:01:32.925882 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"8db68e71-4312-400b-8575-06f87bf6a781\") " pod="openstack/rabbitmq-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.000387 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.015279 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.092929 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-erlang-cookie\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093020 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0bdb4167-8754-4c20-97ea-b014ce2cafdc-pod-info\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093052 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff97b\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-kube-api-access-ff97b\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: 
\"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093082 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-server-conf\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093138 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-confd\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093161 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-plugins-conf\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093290 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0bdb4167-8754-4c20-97ea-b014ce2cafdc-erlang-cookie-secret\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093349 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-config-data\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093381 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-plugins\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093415 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-tls\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.093449 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\" (UID: \"0bdb4167-8754-4c20-97ea-b014ce2cafdc\") " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.096569 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.097148 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.102330 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.102472 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.102836 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.104370 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bdb4167-8754-4c20-97ea-b014ce2cafdc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.115772 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-kube-api-access-ff97b" (OuterVolumeSpecName: "kube-api-access-ff97b") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "kube-api-access-ff97b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.129919 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0bdb4167-8754-4c20-97ea-b014ce2cafdc-pod-info" (OuterVolumeSpecName: "pod-info") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.140893 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-config-data" (OuterVolumeSpecName: "config-data") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196191 4696 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0bdb4167-8754-4c20-97ea-b014ce2cafdc-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196576 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff97b\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-kube-api-access-ff97b\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196595 4696 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196607 4696 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0bdb4167-8754-4c20-97ea-b014ce2cafdc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196644 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196658 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196667 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196696 
4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.196732 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.222900 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-server-conf" (OuterVolumeSpecName: "server-conf") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.239551 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.291080 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0bdb4167-8754-4c20-97ea-b014ce2cafdc" (UID: "0bdb4167-8754-4c20-97ea-b014ce2cafdc"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.299180 4696 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0bdb4167-8754-4c20-97ea-b014ce2cafdc-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.299232 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0bdb4167-8754-4c20-97ea-b014ce2cafdc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.299245 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.478178 4696 generic.go:334] "Generic (PLEG): container finished" podID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerID="12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b" exitCode=0 Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.478263 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0bdb4167-8754-4c20-97ea-b014ce2cafdc","Type":"ContainerDied","Data":"12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b"} Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.478306 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0bdb4167-8754-4c20-97ea-b014ce2cafdc","Type":"ContainerDied","Data":"c137c4eb0e12db16624375acf1c743ab9ee4f246665956ce7b75edd7c223a66c"} Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.478333 4696 scope.go:117] "RemoveContainer" containerID="12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.478387 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.521506 4696 scope.go:117] "RemoveContainer" containerID="15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.521696 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.546104 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.565431 4696 scope.go:117] "RemoveContainer" containerID="12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.570172 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:01:33 crc kubenswrapper[4696]: E0318 16:01:33.570850 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerName="setup-container" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.570870 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerName="setup-container" Mar 18 16:01:33 crc kubenswrapper[4696]: E0318 16:01:33.570889 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerName="rabbitmq" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.570897 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerName="rabbitmq" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.571161 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" containerName="rabbitmq" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.572486 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.579223 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mvprf" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.579548 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.579789 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.580002 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.580164 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.580316 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.580470 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 16:01:33 crc kubenswrapper[4696]: E0318 16:01:33.581066 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b\": container with ID starting with 12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b not found: ID does not exist" containerID="12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.581112 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b"} err="failed to get 
container status \"12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b\": rpc error: code = NotFound desc = could not find container \"12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b\": container with ID starting with 12649b52449583f0f3026135104b5af446bca7a18e2d1e387356eb235a32807b not found: ID does not exist" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.581147 4696 scope.go:117] "RemoveContainer" containerID="15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.587415 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:01:33 crc kubenswrapper[4696]: E0318 16:01:33.589110 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d\": container with ID starting with 15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d not found: ID does not exist" containerID="15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.589169 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d"} err="failed to get container status \"15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d\": rpc error: code = NotFound desc = could not find container \"15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d\": container with ID starting with 15558694b84554c2673508963769937e6f6ef6935177a9200417d184a7df8a9d not found: ID does not exist" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.625224 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bdb4167-8754-4c20-97ea-b014ce2cafdc" path="/var/lib/kubelet/pods/0bdb4167-8754-4c20-97ea-b014ce2cafdc/volumes" Mar 18 
16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.626588 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5b880f-8efc-483f-b734-fa854ddd30dc" path="/var/lib/kubelet/pods/9b5b880f-8efc-483f-b734-fa854ddd30dc/volumes" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.648132 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707471 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707555 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707601 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707632 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad207e86-aeb6-4af2-a411-dee8342b4fe9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 
18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707648 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707739 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvgv\" (UniqueName: \"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-kube-api-access-xsvgv\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707775 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad207e86-aeb6-4af2-a411-dee8342b4fe9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707796 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707813 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc 
kubenswrapper[4696]: I0318 16:01:33.707856 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.707885 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809484 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809588 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809627 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad207e86-aeb6-4af2-a411-dee8342b4fe9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809662 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809709 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvgv\" (UniqueName: \"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-kube-api-access-xsvgv\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809756 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad207e86-aeb6-4af2-a411-dee8342b4fe9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809788 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809814 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809878 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809915 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.809947 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.810250 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.810604 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.810685 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.810966 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.811312 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.811872 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad207e86-aeb6-4af2-a411-dee8342b4fe9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.814456 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.814723 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc 
kubenswrapper[4696]: I0318 16:01:33.815123 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad207e86-aeb6-4af2-a411-dee8342b4fe9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.815488 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad207e86-aeb6-4af2-a411-dee8342b4fe9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.829420 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvgv\" (UniqueName: \"kubernetes.io/projected/ad207e86-aeb6-4af2-a411-dee8342b4fe9-kube-api-access-xsvgv\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.845611 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad207e86-aeb6-4af2-a411-dee8342b4fe9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:33 crc kubenswrapper[4696]: I0318 16:01:33.924628 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.031299 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-8x7l8"] Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.033375 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.035558 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.068981 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-8x7l8"] Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.218437 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-svc\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.218588 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-config\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.218667 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.218851 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " 
pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.219213 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc96w\" (UniqueName: \"kubernetes.io/projected/9aff3283-1485-404b-b76c-1516a700c101-kube-api-access-pc96w\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.219260 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.219297 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.321405 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-config\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.321489 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " 
pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.321545 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.321645 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc96w\" (UniqueName: \"kubernetes.io/projected/9aff3283-1485-404b-b76c-1516a700c101-kube-api-access-pc96w\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.321672 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.321692 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.321718 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-svc\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 
16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.323277 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.323366 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-config\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.323415 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.324056 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-svc\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.324635 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.326818 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.340298 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc96w\" (UniqueName: \"kubernetes.io/projected/9aff3283-1485-404b-b76c-1516a700c101-kube-api-access-pc96w\") pod \"dnsmasq-dns-d558885bc-8x7l8\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.371355 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.471154 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:01:34 crc kubenswrapper[4696]: I0318 16:01:34.517674 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db68e71-4312-400b-8575-06f87bf6a781","Type":"ContainerStarted","Data":"8880fe108d31d9e6225d7675dee32b9a290b26b49ebaf4b7135cad152a3aa259"} Mar 18 16:01:35 crc kubenswrapper[4696]: W0318 16:01:35.008179 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aff3283_1485_404b_b76c_1516a700c101.slice/crio-10065098164a73b463774b2bdef2da901e44dc2cb152caeca6f8a17248b799a6 WatchSource:0}: Error finding container 10065098164a73b463774b2bdef2da901e44dc2cb152caeca6f8a17248b799a6: Status 404 returned error can't find the container with id 10065098164a73b463774b2bdef2da901e44dc2cb152caeca6f8a17248b799a6 Mar 18 16:01:35 crc kubenswrapper[4696]: I0318 16:01:35.014410 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-d558885bc-8x7l8"] Mar 18 16:01:35 crc kubenswrapper[4696]: I0318 16:01:35.540628 4696 generic.go:334] "Generic (PLEG): container finished" podID="9aff3283-1485-404b-b76c-1516a700c101" containerID="e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb" exitCode=0 Mar 18 16:01:35 crc kubenswrapper[4696]: I0318 16:01:35.540716 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" event={"ID":"9aff3283-1485-404b-b76c-1516a700c101","Type":"ContainerDied","Data":"e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb"} Mar 18 16:01:35 crc kubenswrapper[4696]: I0318 16:01:35.540744 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" event={"ID":"9aff3283-1485-404b-b76c-1516a700c101","Type":"ContainerStarted","Data":"10065098164a73b463774b2bdef2da901e44dc2cb152caeca6f8a17248b799a6"} Mar 18 16:01:35 crc kubenswrapper[4696]: I0318 16:01:35.542162 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad207e86-aeb6-4af2-a411-dee8342b4fe9","Type":"ContainerStarted","Data":"140270d9fc10d8bc1cf721d0027f659a65e825ca3beb67a63abc2024c17fa42d"} Mar 18 16:01:35 crc kubenswrapper[4696]: I0318 16:01:35.543415 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db68e71-4312-400b-8575-06f87bf6a781","Type":"ContainerStarted","Data":"1cbea0ec22e01c5967e960d519ceed7f8598affad7b95b6650312f68daedd58d"} Mar 18 16:01:36 crc kubenswrapper[4696]: I0318 16:01:36.563422 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad207e86-aeb6-4af2-a411-dee8342b4fe9","Type":"ContainerStarted","Data":"1a7ddc0ef8202373729a442c07e394694075bb0e854e50c02deb6690ae0d2647"} Mar 18 16:01:36 crc kubenswrapper[4696]: I0318 16:01:36.566754 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-d558885bc-8x7l8" event={"ID":"9aff3283-1485-404b-b76c-1516a700c101","Type":"ContainerStarted","Data":"d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67"} Mar 18 16:01:37 crc kubenswrapper[4696]: I0318 16:01:37.575699 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.372787 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.415990 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" podStartSLOduration=10.415961211 podStartE2EDuration="10.415961211s" podCreationTimestamp="2026-03-18 16:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:36.619017135 +0000 UTC m=+1539.625191351" watchObservedRunningTime="2026-03-18 16:01:44.415961211 +0000 UTC m=+1547.422135427" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.456933 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jxj45"] Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.457352 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" podUID="a024523d-d753-4a81-a8e5-2b416559d14f" containerName="dnsmasq-dns" containerID="cri-o://901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6" gracePeriod=10 Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.685367 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-kff5g"] Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.687756 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.707497 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-kff5g"] Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.757398 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq269\" (UniqueName: \"kubernetes.io/projected/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-kube-api-access-pq269\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.757491 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.757549 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.757579 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.757657 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.757791 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-config\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.757860 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.859589 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-config\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.859670 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.859707 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pq269\" (UniqueName: \"kubernetes.io/projected/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-kube-api-access-pq269\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.859733 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.859762 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.859781 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.859817 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.860766 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.861147 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.861196 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-config\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.864170 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.865151 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.865855 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:44 crc kubenswrapper[4696]: I0318 16:01:44.891072 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq269\" (UniqueName: \"kubernetes.io/projected/b04fc1e7-0f41-46df-90ac-71d0b7d4e29d-kube-api-access-pq269\") pod \"dnsmasq-dns-78c64bc9c5-kff5g\" (UID: \"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d\") " pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.017332 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" podUID="a024523d-d753-4a81-a8e5-2b416559d14f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.207:5353: connect: connection refused" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.053549 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.458184 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.572234 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-nb\") pod \"a024523d-d753-4a81-a8e5-2b416559d14f\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.572542 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-svc\") pod \"a024523d-d753-4a81-a8e5-2b416559d14f\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.572597 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-sb\") pod \"a024523d-d753-4a81-a8e5-2b416559d14f\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.572624 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-swift-storage-0\") pod \"a024523d-d753-4a81-a8e5-2b416559d14f\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.572825 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-config\") pod \"a024523d-d753-4a81-a8e5-2b416559d14f\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.572840 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq28k\" 
(UniqueName: \"kubernetes.io/projected/a024523d-d753-4a81-a8e5-2b416559d14f-kube-api-access-pq28k\") pod \"a024523d-d753-4a81-a8e5-2b416559d14f\" (UID: \"a024523d-d753-4a81-a8e5-2b416559d14f\") " Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.582797 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a024523d-d753-4a81-a8e5-2b416559d14f-kube-api-access-pq28k" (OuterVolumeSpecName: "kube-api-access-pq28k") pod "a024523d-d753-4a81-a8e5-2b416559d14f" (UID: "a024523d-d753-4a81-a8e5-2b416559d14f"). InnerVolumeSpecName "kube-api-access-pq28k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.595295 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-kff5g"] Mar 18 16:01:45 crc kubenswrapper[4696]: W0318 16:01:45.598703 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb04fc1e7_0f41_46df_90ac_71d0b7d4e29d.slice/crio-77e2dd631b0c9f5d22222112f8bc537ede4502df5781a6432fcabfff15bf93ca WatchSource:0}: Error finding container 77e2dd631b0c9f5d22222112f8bc537ede4502df5781a6432fcabfff15bf93ca: Status 404 returned error can't find the container with id 77e2dd631b0c9f5d22222112f8bc537ede4502df5781a6432fcabfff15bf93ca Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.644486 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a024523d-d753-4a81-a8e5-2b416559d14f" (UID: "a024523d-d753-4a81-a8e5-2b416559d14f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.653343 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-config" (OuterVolumeSpecName: "config") pod "a024523d-d753-4a81-a8e5-2b416559d14f" (UID: "a024523d-d753-4a81-a8e5-2b416559d14f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.660139 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a024523d-d753-4a81-a8e5-2b416559d14f" (UID: "a024523d-d753-4a81-a8e5-2b416559d14f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.667303 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a024523d-d753-4a81-a8e5-2b416559d14f" (UID: "a024523d-d753-4a81-a8e5-2b416559d14f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.667923 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a024523d-d753-4a81-a8e5-2b416559d14f" (UID: "a024523d-d753-4a81-a8e5-2b416559d14f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.677453 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.677492 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq28k\" (UniqueName: \"kubernetes.io/projected/a024523d-d753-4a81-a8e5-2b416559d14f-kube-api-access-pq28k\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.677507 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.677590 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.677603 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.677613 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a024523d-d753-4a81-a8e5-2b416559d14f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.696847 4696 generic.go:334] "Generic (PLEG): container finished" podID="a024523d-d753-4a81-a8e5-2b416559d14f" containerID="901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6" exitCode=0 Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.697016 4696 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.806039 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" event={"ID":"a024523d-d753-4a81-a8e5-2b416559d14f","Type":"ContainerDied","Data":"901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6"} Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.806103 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jxj45" event={"ID":"a024523d-d753-4a81-a8e5-2b416559d14f","Type":"ContainerDied","Data":"bf76b2109dddc178c34afe1cd4c7304a087d90d672ced243f59bcabfccf1070f"} Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.806121 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" event={"ID":"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d","Type":"ContainerStarted","Data":"77e2dd631b0c9f5d22222112f8bc537ede4502df5781a6432fcabfff15bf93ca"} Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.806144 4696 scope.go:117] "RemoveContainer" containerID="901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.858218 4696 scope.go:117] "RemoveContainer" containerID="f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.874203 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jxj45"] Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.882064 4696 scope.go:117] "RemoveContainer" containerID="901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6" Mar 18 16:01:45 crc kubenswrapper[4696]: E0318 16:01:45.882564 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6\": container with ID starting with 901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6 not found: ID does not exist" containerID="901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.882612 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6"} err="failed to get container status \"901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6\": rpc error: code = NotFound desc = could not find container \"901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6\": container with ID starting with 901019a8a0d3f4547c51fc68c642baf6be7145eca0abb34460bc2d40619fb0b6 not found: ID does not exist" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.882651 4696 scope.go:117] "RemoveContainer" containerID="f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22" Mar 18 16:01:45 crc kubenswrapper[4696]: E0318 16:01:45.882964 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22\": container with ID starting with f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22 not found: ID does not exist" containerID="f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.882994 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22"} err="failed to get container status \"f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22\": rpc error: code = NotFound desc = could not find container \"f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22\": container with ID 
starting with f72d9cbecb9c09ac23c1cb02a92aa449ab1d0d3d3b2069c25d80ad1f34364e22 not found: ID does not exist" Mar 18 16:01:45 crc kubenswrapper[4696]: I0318 16:01:45.887053 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jxj45"] Mar 18 16:01:46 crc kubenswrapper[4696]: I0318 16:01:46.721346 4696 generic.go:334] "Generic (PLEG): container finished" podID="b04fc1e7-0f41-46df-90ac-71d0b7d4e29d" containerID="9d30f5e6e27bb776174e0961062a9e27533b37d5b7e786a0db45bf7480c2a35b" exitCode=0 Mar 18 16:01:46 crc kubenswrapper[4696]: I0318 16:01:46.721410 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" event={"ID":"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d","Type":"ContainerDied","Data":"9d30f5e6e27bb776174e0961062a9e27533b37d5b7e786a0db45bf7480c2a35b"} Mar 18 16:01:47 crc kubenswrapper[4696]: I0318 16:01:47.609572 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a024523d-d753-4a81-a8e5-2b416559d14f" path="/var/lib/kubelet/pods/a024523d-d753-4a81-a8e5-2b416559d14f/volumes" Mar 18 16:01:47 crc kubenswrapper[4696]: I0318 16:01:47.734763 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" event={"ID":"b04fc1e7-0f41-46df-90ac-71d0b7d4e29d","Type":"ContainerStarted","Data":"518add2608a7c7123cfe57e6899651fc2a6c143a423904e1af4c4ef7bc2192a4"} Mar 18 16:01:47 crc kubenswrapper[4696]: I0318 16:01:47.734908 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:47 crc kubenswrapper[4696]: I0318 16:01:47.760286 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" podStartSLOduration=3.76026241 podStartE2EDuration="3.76026241s" podCreationTimestamp="2026-03-18 16:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 16:01:47.758346361 +0000 UTC m=+1550.764520567" watchObservedRunningTime="2026-03-18 16:01:47.76026241 +0000 UTC m=+1550.766436626" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.056177 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-kff5g" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.143668 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-8x7l8"] Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.144134 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" podUID="9aff3283-1485-404b-b76c-1516a700c101" containerName="dnsmasq-dns" containerID="cri-o://d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67" gracePeriod=10 Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.773027 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.830225 4696 generic.go:334] "Generic (PLEG): container finished" podID="9aff3283-1485-404b-b76c-1516a700c101" containerID="d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67" exitCode=0 Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.830287 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" event={"ID":"9aff3283-1485-404b-b76c-1516a700c101","Type":"ContainerDied","Data":"d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67"} Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.830352 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" event={"ID":"9aff3283-1485-404b-b76c-1516a700c101","Type":"ContainerDied","Data":"10065098164a73b463774b2bdef2da901e44dc2cb152caeca6f8a17248b799a6"} Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.830376 
4696 scope.go:117] "RemoveContainer" containerID="d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.830597 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-8x7l8" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.859235 4696 scope.go:117] "RemoveContainer" containerID="e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.884841 4696 scope.go:117] "RemoveContainer" containerID="d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67" Mar 18 16:01:55 crc kubenswrapper[4696]: E0318 16:01:55.885442 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67\": container with ID starting with d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67 not found: ID does not exist" containerID="d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.885502 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67"} err="failed to get container status \"d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67\": rpc error: code = NotFound desc = could not find container \"d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67\": container with ID starting with d46bb19d3e0860b39bb61aa84b6921c9553d7bb4e8e0afc7bf78274c199b1e67 not found: ID does not exist" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.885566 4696 scope.go:117] "RemoveContainer" containerID="e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb" Mar 18 16:01:55 crc kubenswrapper[4696]: E0318 16:01:55.886020 4696 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb\": container with ID starting with e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb not found: ID does not exist" containerID="e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.886095 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb"} err="failed to get container status \"e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb\": rpc error: code = NotFound desc = could not find container \"e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb\": container with ID starting with e4c0cbf71ef1d796f0389037930df0197aeff8ecaa36caf07fa1f84bac0508fb not found: ID does not exist" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.915354 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-openstack-edpm-ipam\") pod \"9aff3283-1485-404b-b76c-1516a700c101\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.915830 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-svc\") pod \"9aff3283-1485-404b-b76c-1516a700c101\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.915891 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-config\") pod \"9aff3283-1485-404b-b76c-1516a700c101\" (UID: 
\"9aff3283-1485-404b-b76c-1516a700c101\") " Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.915937 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc96w\" (UniqueName: \"kubernetes.io/projected/9aff3283-1485-404b-b76c-1516a700c101-kube-api-access-pc96w\") pod \"9aff3283-1485-404b-b76c-1516a700c101\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.916002 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-sb\") pod \"9aff3283-1485-404b-b76c-1516a700c101\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.916126 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-nb\") pod \"9aff3283-1485-404b-b76c-1516a700c101\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.916220 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-swift-storage-0\") pod \"9aff3283-1485-404b-b76c-1516a700c101\" (UID: \"9aff3283-1485-404b-b76c-1516a700c101\") " Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.922791 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aff3283-1485-404b-b76c-1516a700c101-kube-api-access-pc96w" (OuterVolumeSpecName: "kube-api-access-pc96w") pod "9aff3283-1485-404b-b76c-1516a700c101" (UID: "9aff3283-1485-404b-b76c-1516a700c101"). InnerVolumeSpecName "kube-api-access-pc96w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.967744 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9aff3283-1485-404b-b76c-1516a700c101" (UID: "9aff3283-1485-404b-b76c-1516a700c101"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.974151 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-config" (OuterVolumeSpecName: "config") pod "9aff3283-1485-404b-b76c-1516a700c101" (UID: "9aff3283-1485-404b-b76c-1516a700c101"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.974514 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "9aff3283-1485-404b-b76c-1516a700c101" (UID: "9aff3283-1485-404b-b76c-1516a700c101"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.978211 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9aff3283-1485-404b-b76c-1516a700c101" (UID: "9aff3283-1485-404b-b76c-1516a700c101"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:55 crc kubenswrapper[4696]: I0318 16:01:55.981748 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9aff3283-1485-404b-b76c-1516a700c101" (UID: "9aff3283-1485-404b-b76c-1516a700c101"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.004827 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9aff3283-1485-404b-b76c-1516a700c101" (UID: "9aff3283-1485-404b-b76c-1516a700c101"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.019478 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.019548 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.019564 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.019579 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.019595 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.019608 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aff3283-1485-404b-b76c-1516a700c101-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.019621 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc96w\" (UniqueName: \"kubernetes.io/projected/9aff3283-1485-404b-b76c-1516a700c101-kube-api-access-pc96w\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.170927 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-8x7l8"] Mar 18 16:01:56 crc kubenswrapper[4696]: I0318 16:01:56.178472 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-8x7l8"] Mar 18 16:01:57 crc kubenswrapper[4696]: I0318 16:01:57.612567 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aff3283-1485-404b-b76c-1516a700c101" path="/var/lib/kubelet/pods/9aff3283-1485-404b-b76c-1516a700c101/volumes" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.137386 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564162-mdws6"] Mar 18 16:02:00 crc kubenswrapper[4696]: E0318 16:02:00.138395 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a024523d-d753-4a81-a8e5-2b416559d14f" containerName="init" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.138411 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a024523d-d753-4a81-a8e5-2b416559d14f" containerName="init" Mar 18 16:02:00 crc kubenswrapper[4696]: E0318 16:02:00.138431 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aff3283-1485-404b-b76c-1516a700c101" containerName="dnsmasq-dns" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.138439 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aff3283-1485-404b-b76c-1516a700c101" containerName="dnsmasq-dns" Mar 18 16:02:00 crc kubenswrapper[4696]: E0318 16:02:00.138449 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aff3283-1485-404b-b76c-1516a700c101" containerName="init" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.138465 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aff3283-1485-404b-b76c-1516a700c101" containerName="init" Mar 18 16:02:00 crc kubenswrapper[4696]: E0318 16:02:00.138499 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a024523d-d753-4a81-a8e5-2b416559d14f" containerName="dnsmasq-dns" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.138506 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a024523d-d753-4a81-a8e5-2b416559d14f" containerName="dnsmasq-dns" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.138698 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aff3283-1485-404b-b76c-1516a700c101" containerName="dnsmasq-dns" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.138720 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a024523d-d753-4a81-a8e5-2b416559d14f" containerName="dnsmasq-dns" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.139447 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-mdws6" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.141975 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.142154 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.143194 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.150700 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-mdws6"] Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.303585 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf4rv\" (UniqueName: \"kubernetes.io/projected/5c3a1920-bf4b-4b2c-a0df-889337ff9f2e-kube-api-access-gf4rv\") pod \"auto-csr-approver-29564162-mdws6\" (UID: \"5c3a1920-bf4b-4b2c-a0df-889337ff9f2e\") " pod="openshift-infra/auto-csr-approver-29564162-mdws6" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.407074 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf4rv\" (UniqueName: \"kubernetes.io/projected/5c3a1920-bf4b-4b2c-a0df-889337ff9f2e-kube-api-access-gf4rv\") pod \"auto-csr-approver-29564162-mdws6\" (UID: \"5c3a1920-bf4b-4b2c-a0df-889337ff9f2e\") " pod="openshift-infra/auto-csr-approver-29564162-mdws6" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.429792 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf4rv\" (UniqueName: \"kubernetes.io/projected/5c3a1920-bf4b-4b2c-a0df-889337ff9f2e-kube-api-access-gf4rv\") pod \"auto-csr-approver-29564162-mdws6\" (UID: \"5c3a1920-bf4b-4b2c-a0df-889337ff9f2e\") " 
pod="openshift-infra/auto-csr-approver-29564162-mdws6" Mar 18 16:02:00 crc kubenswrapper[4696]: I0318 16:02:00.461717 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-mdws6" Mar 18 16:02:01 crc kubenswrapper[4696]: W0318 16:02:01.005259 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c3a1920_bf4b_4b2c_a0df_889337ff9f2e.slice/crio-9adfbe75d7ea35e21bbe9ed310ce954bb5fa36bf41dd82bcc10e004283706d26 WatchSource:0}: Error finding container 9adfbe75d7ea35e21bbe9ed310ce954bb5fa36bf41dd82bcc10e004283706d26: Status 404 returned error can't find the container with id 9adfbe75d7ea35e21bbe9ed310ce954bb5fa36bf41dd82bcc10e004283706d26 Mar 18 16:02:01 crc kubenswrapper[4696]: I0318 16:02:01.016297 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-mdws6"] Mar 18 16:02:01 crc kubenswrapper[4696]: I0318 16:02:01.899567 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564162-mdws6" event={"ID":"5c3a1920-bf4b-4b2c-a0df-889337ff9f2e","Type":"ContainerStarted","Data":"9adfbe75d7ea35e21bbe9ed310ce954bb5fa36bf41dd82bcc10e004283706d26"} Mar 18 16:02:02 crc kubenswrapper[4696]: I0318 16:02:02.910718 4696 generic.go:334] "Generic (PLEG): container finished" podID="5c3a1920-bf4b-4b2c-a0df-889337ff9f2e" containerID="c67eb1bc03daa547bc5db788dc359da158c9646c80c82286bcd12bd859ba670f" exitCode=0 Mar 18 16:02:02 crc kubenswrapper[4696]: I0318 16:02:02.910817 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564162-mdws6" event={"ID":"5c3a1920-bf4b-4b2c-a0df-889337ff9f2e","Type":"ContainerDied","Data":"c67eb1bc03daa547bc5db788dc359da158c9646c80c82286bcd12bd859ba670f"} Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.644801 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs"] Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.646274 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.649118 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.649206 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.649764 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.654948 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.660715 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs"] Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.775095 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.775537 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtpn\" (UniqueName: \"kubernetes.io/projected/0489a724-0e24-4090-afc8-8d7baec47630-kube-api-access-7mtpn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: 
\"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.775703 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.776002 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.878318 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.878426 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: 
I0318 16:02:03.878594 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mtpn\" (UniqueName: \"kubernetes.io/projected/0489a724-0e24-4090-afc8-8d7baec47630-kube-api-access-7mtpn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.878621 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.884148 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.885194 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.892158 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.896639 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtpn\" (UniqueName: \"kubernetes.io/projected/0489a724-0e24-4090-afc8-8d7baec47630-kube-api-access-7mtpn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:03 crc kubenswrapper[4696]: I0318 16:02:03.972559 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 16:02:04.247759 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-mdws6" Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 16:02:04.389163 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf4rv\" (UniqueName: \"kubernetes.io/projected/5c3a1920-bf4b-4b2c-a0df-889337ff9f2e-kube-api-access-gf4rv\") pod \"5c3a1920-bf4b-4b2c-a0df-889337ff9f2e\" (UID: \"5c3a1920-bf4b-4b2c-a0df-889337ff9f2e\") " Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 16:02:04.401542 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3a1920-bf4b-4b2c-a0df-889337ff9f2e-kube-api-access-gf4rv" (OuterVolumeSpecName: "kube-api-access-gf4rv") pod "5c3a1920-bf4b-4b2c-a0df-889337ff9f2e" (UID: "5c3a1920-bf4b-4b2c-a0df-889337ff9f2e"). InnerVolumeSpecName "kube-api-access-gf4rv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 16:02:04.501364 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf4rv\" (UniqueName: \"kubernetes.io/projected/5c3a1920-bf4b-4b2c-a0df-889337ff9f2e-kube-api-access-gf4rv\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 16:02:04.650841 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs"] Mar 18 16:02:04 crc kubenswrapper[4696]: W0318 16:02:04.651613 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0489a724_0e24_4090_afc8_8d7baec47630.slice/crio-6450e1210bfe4ae7fd719bf8ad1a20c8bdf26602e4b7fe2da5707c6be87a6e99 WatchSource:0}: Error finding container 6450e1210bfe4ae7fd719bf8ad1a20c8bdf26602e4b7fe2da5707c6be87a6e99: Status 404 returned error can't find the container with id 6450e1210bfe4ae7fd719bf8ad1a20c8bdf26602e4b7fe2da5707c6be87a6e99 Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 16:02:04.936356 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" event={"ID":"0489a724-0e24-4090-afc8-8d7baec47630","Type":"ContainerStarted","Data":"6450e1210bfe4ae7fd719bf8ad1a20c8bdf26602e4b7fe2da5707c6be87a6e99"} Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 16:02:04.938805 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564162-mdws6" event={"ID":"5c3a1920-bf4b-4b2c-a0df-889337ff9f2e","Type":"ContainerDied","Data":"9adfbe75d7ea35e21bbe9ed310ce954bb5fa36bf41dd82bcc10e004283706d26"} Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 16:02:04.938883 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adfbe75d7ea35e21bbe9ed310ce954bb5fa36bf41dd82bcc10e004283706d26" Mar 18 16:02:04 crc kubenswrapper[4696]: I0318 
16:02:04.938908 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-mdws6" Mar 18 16:02:05 crc kubenswrapper[4696]: I0318 16:02:05.320909 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8bzbz"] Mar 18 16:02:05 crc kubenswrapper[4696]: I0318 16:02:05.331552 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8bzbz"] Mar 18 16:02:05 crc kubenswrapper[4696]: I0318 16:02:05.619444 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc7a7cdc-3686-4af9-89ab-fea81132767c" path="/var/lib/kubelet/pods/bc7a7cdc-3686-4af9-89ab-fea81132767c/volumes" Mar 18 16:02:07 crc kubenswrapper[4696]: I0318 16:02:07.977413 4696 generic.go:334] "Generic (PLEG): container finished" podID="8db68e71-4312-400b-8575-06f87bf6a781" containerID="1cbea0ec22e01c5967e960d519ceed7f8598affad7b95b6650312f68daedd58d" exitCode=0 Mar 18 16:02:07 crc kubenswrapper[4696]: I0318 16:02:07.977494 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db68e71-4312-400b-8575-06f87bf6a781","Type":"ContainerDied","Data":"1cbea0ec22e01c5967e960d519ceed7f8598affad7b95b6650312f68daedd58d"} Mar 18 16:02:08 crc kubenswrapper[4696]: I0318 16:02:08.990664 4696 generic.go:334] "Generic (PLEG): container finished" podID="ad207e86-aeb6-4af2-a411-dee8342b4fe9" containerID="1a7ddc0ef8202373729a442c07e394694075bb0e854e50c02deb6690ae0d2647" exitCode=0 Mar 18 16:02:08 crc kubenswrapper[4696]: I0318 16:02:08.990741 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad207e86-aeb6-4af2-a411-dee8342b4fe9","Type":"ContainerDied","Data":"1a7ddc0ef8202373729a442c07e394694075bb0e854e50c02deb6690ae0d2647"} Mar 18 16:02:08 crc kubenswrapper[4696]: I0318 16:02:08.995445 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"8db68e71-4312-400b-8575-06f87bf6a781","Type":"ContainerStarted","Data":"5b5ab09b48b358b77b6b94a39498affb63650adf88f50f67b003cd2df8c2f1dc"} Mar 18 16:02:08 crc kubenswrapper[4696]: I0318 16:02:08.995639 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 16:02:09 crc kubenswrapper[4696]: I0318 16:02:09.045162 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.04513654 podStartE2EDuration="37.04513654s" podCreationTimestamp="2026-03-18 16:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:09.037820786 +0000 UTC m=+1572.043994992" watchObservedRunningTime="2026-03-18 16:02:09.04513654 +0000 UTC m=+1572.051310756" Mar 18 16:02:10 crc kubenswrapper[4696]: I0318 16:02:10.007422 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad207e86-aeb6-4af2-a411-dee8342b4fe9","Type":"ContainerStarted","Data":"e057e2686c4830b07b7ddb966ef1cfac539d24dce67435491ff6c4963cc6e82f"} Mar 18 16:02:10 crc kubenswrapper[4696]: I0318 16:02:10.008577 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:10 crc kubenswrapper[4696]: I0318 16:02:10.048435 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.04841544 podStartE2EDuration="37.04841544s" podCreationTimestamp="2026-03-18 16:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:10.034424079 +0000 UTC m=+1573.040598285" watchObservedRunningTime="2026-03-18 16:02:10.04841544 +0000 UTC m=+1573.054589646" Mar 18 16:02:12 crc kubenswrapper[4696]: 
I0318 16:02:12.184772 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:02:12 crc kubenswrapper[4696]: I0318 16:02:12.185137 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:02:19 crc kubenswrapper[4696]: I0318 16:02:19.116098 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" event={"ID":"0489a724-0e24-4090-afc8-8d7baec47630","Type":"ContainerStarted","Data":"7f6dd53e56e924e31a4fbd6bf334a73a2555f1b853d35bb9722215648a490f68"} Mar 18 16:02:19 crc kubenswrapper[4696]: I0318 16:02:19.171486 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" podStartSLOduration=2.7604567859999998 podStartE2EDuration="16.171457807s" podCreationTimestamp="2026-03-18 16:02:03 +0000 UTC" firstStartedPulling="2026-03-18 16:02:04.656046348 +0000 UTC m=+1567.662220574" lastFinishedPulling="2026-03-18 16:02:18.067047389 +0000 UTC m=+1581.073221595" observedRunningTime="2026-03-18 16:02:19.162170244 +0000 UTC m=+1582.168344450" watchObservedRunningTime="2026-03-18 16:02:19.171457807 +0000 UTC m=+1582.177632023" Mar 18 16:02:19 crc kubenswrapper[4696]: I0318 16:02:19.964072 4696 scope.go:117] "RemoveContainer" containerID="1baf2ee679ae0ca62c028acde1076539a1c73dbbd112ff315de974a1afefd1cd" Mar 18 16:02:19 crc kubenswrapper[4696]: I0318 16:02:19.989027 4696 scope.go:117] "RemoveContainer" 
containerID="ba3ab3f011f8667d0271bb0f763da555ddb50262d2ff5e11786bcf5016594a64" Mar 18 16:02:23 crc kubenswrapper[4696]: I0318 16:02:23.005748 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 16:02:23 crc kubenswrapper[4696]: I0318 16:02:23.927758 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:26 crc kubenswrapper[4696]: I0318 16:02:26.899683 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8sq5r"] Mar 18 16:02:26 crc kubenswrapper[4696]: E0318 16:02:26.900885 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3a1920-bf4b-4b2c-a0df-889337ff9f2e" containerName="oc" Mar 18 16:02:26 crc kubenswrapper[4696]: I0318 16:02:26.900902 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3a1920-bf4b-4b2c-a0df-889337ff9f2e" containerName="oc" Mar 18 16:02:26 crc kubenswrapper[4696]: I0318 16:02:26.901174 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3a1920-bf4b-4b2c-a0df-889337ff9f2e" containerName="oc" Mar 18 16:02:26 crc kubenswrapper[4696]: I0318 16:02:26.903108 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:26 crc kubenswrapper[4696]: I0318 16:02:26.910594 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8sq5r"] Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.014409 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-catalog-content\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.014615 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f9gl\" (UniqueName: \"kubernetes.io/projected/2ff751f2-f660-4b99-9708-1885cd11b3bd-kube-api-access-8f9gl\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.014786 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-utilities\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.116770 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-utilities\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.116911 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-catalog-content\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.116966 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f9gl\" (UniqueName: \"kubernetes.io/projected/2ff751f2-f660-4b99-9708-1885cd11b3bd-kube-api-access-8f9gl\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.118069 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-utilities\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.118360 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-catalog-content\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.139684 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f9gl\" (UniqueName: \"kubernetes.io/projected/2ff751f2-f660-4b99-9708-1885cd11b3bd-kube-api-access-8f9gl\") pod \"community-operators-8sq5r\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.222581 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:27 crc kubenswrapper[4696]: I0318 16:02:27.766012 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8sq5r"] Mar 18 16:02:27 crc kubenswrapper[4696]: W0318 16:02:27.767303 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ff751f2_f660_4b99_9708_1885cd11b3bd.slice/crio-b8eeed16ed5ee75d05e09ad5f85e86c344aadda3d76ab4a4dc6c814ab12f3261 WatchSource:0}: Error finding container b8eeed16ed5ee75d05e09ad5f85e86c344aadda3d76ab4a4dc6c814ab12f3261: Status 404 returned error can't find the container with id b8eeed16ed5ee75d05e09ad5f85e86c344aadda3d76ab4a4dc6c814ab12f3261 Mar 18 16:02:28 crc kubenswrapper[4696]: I0318 16:02:28.216779 4696 generic.go:334] "Generic (PLEG): container finished" podID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerID="0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a" exitCode=0 Mar 18 16:02:28 crc kubenswrapper[4696]: I0318 16:02:28.216873 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sq5r" event={"ID":"2ff751f2-f660-4b99-9708-1885cd11b3bd","Type":"ContainerDied","Data":"0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a"} Mar 18 16:02:28 crc kubenswrapper[4696]: I0318 16:02:28.219707 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sq5r" event={"ID":"2ff751f2-f660-4b99-9708-1885cd11b3bd","Type":"ContainerStarted","Data":"b8eeed16ed5ee75d05e09ad5f85e86c344aadda3d76ab4a4dc6c814ab12f3261"} Mar 18 16:02:30 crc kubenswrapper[4696]: I0318 16:02:30.244264 4696 generic.go:334] "Generic (PLEG): container finished" podID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerID="527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5" exitCode=0 Mar 18 16:02:30 crc kubenswrapper[4696]: I0318 
16:02:30.244333 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sq5r" event={"ID":"2ff751f2-f660-4b99-9708-1885cd11b3bd","Type":"ContainerDied","Data":"527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5"} Mar 18 16:02:31 crc kubenswrapper[4696]: I0318 16:02:31.253692 4696 generic.go:334] "Generic (PLEG): container finished" podID="0489a724-0e24-4090-afc8-8d7baec47630" containerID="7f6dd53e56e924e31a4fbd6bf334a73a2555f1b853d35bb9722215648a490f68" exitCode=0 Mar 18 16:02:31 crc kubenswrapper[4696]: I0318 16:02:31.254810 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" event={"ID":"0489a724-0e24-4090-afc8-8d7baec47630","Type":"ContainerDied","Data":"7f6dd53e56e924e31a4fbd6bf334a73a2555f1b853d35bb9722215648a490f68"} Mar 18 16:02:31 crc kubenswrapper[4696]: I0318 16:02:31.259126 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sq5r" event={"ID":"2ff751f2-f660-4b99-9708-1885cd11b3bd","Type":"ContainerStarted","Data":"50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed"} Mar 18 16:02:31 crc kubenswrapper[4696]: I0318 16:02:31.303166 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8sq5r" podStartSLOduration=2.646493982 podStartE2EDuration="5.303140191s" podCreationTimestamp="2026-03-18 16:02:26 +0000 UTC" firstStartedPulling="2026-03-18 16:02:28.219432578 +0000 UTC m=+1591.225606784" lastFinishedPulling="2026-03-18 16:02:30.876078787 +0000 UTC m=+1593.882252993" observedRunningTime="2026-03-18 16:02:31.300030383 +0000 UTC m=+1594.306204599" watchObservedRunningTime="2026-03-18 16:02:31.303140191 +0000 UTC m=+1594.309314397" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.721561 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.854491 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-ssh-key-openstack-edpm-ipam\") pod \"0489a724-0e24-4090-afc8-8d7baec47630\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.854670 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mtpn\" (UniqueName: \"kubernetes.io/projected/0489a724-0e24-4090-afc8-8d7baec47630-kube-api-access-7mtpn\") pod \"0489a724-0e24-4090-afc8-8d7baec47630\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.854874 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-inventory\") pod \"0489a724-0e24-4090-afc8-8d7baec47630\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.854964 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-repo-setup-combined-ca-bundle\") pod \"0489a724-0e24-4090-afc8-8d7baec47630\" (UID: \"0489a724-0e24-4090-afc8-8d7baec47630\") " Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.866779 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0489a724-0e24-4090-afc8-8d7baec47630" (UID: "0489a724-0e24-4090-afc8-8d7baec47630"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.866792 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0489a724-0e24-4090-afc8-8d7baec47630-kube-api-access-7mtpn" (OuterVolumeSpecName: "kube-api-access-7mtpn") pod "0489a724-0e24-4090-afc8-8d7baec47630" (UID: "0489a724-0e24-4090-afc8-8d7baec47630"). InnerVolumeSpecName "kube-api-access-7mtpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.903640 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0489a724-0e24-4090-afc8-8d7baec47630" (UID: "0489a724-0e24-4090-afc8-8d7baec47630"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.916176 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-inventory" (OuterVolumeSpecName: "inventory") pod "0489a724-0e24-4090-afc8-8d7baec47630" (UID: "0489a724-0e24-4090-afc8-8d7baec47630"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.957757 4696 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.957838 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.957855 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mtpn\" (UniqueName: \"kubernetes.io/projected/0489a724-0e24-4090-afc8-8d7baec47630-kube-api-access-7mtpn\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:32 crc kubenswrapper[4696]: I0318 16:02:32.957871 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0489a724-0e24-4090-afc8-8d7baec47630-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.279320 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" event={"ID":"0489a724-0e24-4090-afc8-8d7baec47630","Type":"ContainerDied","Data":"6450e1210bfe4ae7fd719bf8ad1a20c8bdf26602e4b7fe2da5707c6be87a6e99"} Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.279363 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6450e1210bfe4ae7fd719bf8ad1a20c8bdf26602e4b7fe2da5707c6be87a6e99" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.279430 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.360035 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r"] Mar 18 16:02:33 crc kubenswrapper[4696]: E0318 16:02:33.360637 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0489a724-0e24-4090-afc8-8d7baec47630" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.360666 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0489a724-0e24-4090-afc8-8d7baec47630" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.360942 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0489a724-0e24-4090-afc8-8d7baec47630" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.361764 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.364653 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.364893 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.365024 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.365167 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.370416 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r"] Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.467435 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw9km\" (UniqueName: \"kubernetes.io/projected/ae98a130-1216-4906-8e7b-3721a2857935-kube-api-access-rw9km\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.467812 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.468073 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.569892 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.570071 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw9km\" (UniqueName: \"kubernetes.io/projected/ae98a130-1216-4906-8e7b-3721a2857935-kube-api-access-rw9km\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.570100 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.576928 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: 
\"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.587988 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.588169 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw9km\" (UniqueName: \"kubernetes.io/projected/ae98a130-1216-4906-8e7b-3721a2857935-kube-api-access-rw9km\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vxq5r\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:33 crc kubenswrapper[4696]: I0318 16:02:33.684800 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:34 crc kubenswrapper[4696]: W0318 16:02:34.273279 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae98a130_1216_4906_8e7b_3721a2857935.slice/crio-672bb12510e4c8d9d9c8b1ceeb9bb3b80097fa330339cfb6090fd9c509168250 WatchSource:0}: Error finding container 672bb12510e4c8d9d9c8b1ceeb9bb3b80097fa330339cfb6090fd9c509168250: Status 404 returned error can't find the container with id 672bb12510e4c8d9d9c8b1ceeb9bb3b80097fa330339cfb6090fd9c509168250 Mar 18 16:02:34 crc kubenswrapper[4696]: I0318 16:02:34.282709 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r"] Mar 18 16:02:34 crc kubenswrapper[4696]: I0318 16:02:34.294807 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" event={"ID":"ae98a130-1216-4906-8e7b-3721a2857935","Type":"ContainerStarted","Data":"672bb12510e4c8d9d9c8b1ceeb9bb3b80097fa330339cfb6090fd9c509168250"} Mar 18 16:02:35 crc kubenswrapper[4696]: I0318 16:02:35.305763 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" event={"ID":"ae98a130-1216-4906-8e7b-3721a2857935","Type":"ContainerStarted","Data":"73da2f25757614663dbce9bcbb1dfb3629e02fa3f18ef3997bc6134941414857"} Mar 18 16:02:35 crc kubenswrapper[4696]: I0318 16:02:35.329857 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" podStartSLOduration=2.059308765 podStartE2EDuration="2.329835632s" podCreationTimestamp="2026-03-18 16:02:33 +0000 UTC" firstStartedPulling="2026-03-18 16:02:34.277804139 +0000 UTC m=+1597.283978345" lastFinishedPulling="2026-03-18 16:02:34.548331006 +0000 UTC m=+1597.554505212" observedRunningTime="2026-03-18 
16:02:35.320834466 +0000 UTC m=+1598.327008672" watchObservedRunningTime="2026-03-18 16:02:35.329835632 +0000 UTC m=+1598.336009838" Mar 18 16:02:37 crc kubenswrapper[4696]: I0318 16:02:37.224062 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:37 crc kubenswrapper[4696]: I0318 16:02:37.224426 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:37 crc kubenswrapper[4696]: I0318 16:02:37.278680 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:37 crc kubenswrapper[4696]: I0318 16:02:37.369855 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:37 crc kubenswrapper[4696]: I0318 16:02:37.517356 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8sq5r"] Mar 18 16:02:38 crc kubenswrapper[4696]: I0318 16:02:38.335370 4696 generic.go:334] "Generic (PLEG): container finished" podID="ae98a130-1216-4906-8e7b-3721a2857935" containerID="73da2f25757614663dbce9bcbb1dfb3629e02fa3f18ef3997bc6134941414857" exitCode=0 Mar 18 16:02:38 crc kubenswrapper[4696]: I0318 16:02:38.335791 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" event={"ID":"ae98a130-1216-4906-8e7b-3721a2857935","Type":"ContainerDied","Data":"73da2f25757614663dbce9bcbb1dfb3629e02fa3f18ef3997bc6134941414857"} Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.345305 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8sq5r" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerName="registry-server" 
containerID="cri-o://50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed" gracePeriod=2 Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.891629 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.901676 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.940945 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-ssh-key-openstack-edpm-ipam\") pod \"ae98a130-1216-4906-8e7b-3721a2857935\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.941142 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-inventory\") pod \"ae98a130-1216-4906-8e7b-3721a2857935\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.941283 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw9km\" (UniqueName: \"kubernetes.io/projected/ae98a130-1216-4906-8e7b-3721a2857935-kube-api-access-rw9km\") pod \"ae98a130-1216-4906-8e7b-3721a2857935\" (UID: \"ae98a130-1216-4906-8e7b-3721a2857935\") " Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.949347 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae98a130-1216-4906-8e7b-3721a2857935-kube-api-access-rw9km" (OuterVolumeSpecName: "kube-api-access-rw9km") pod "ae98a130-1216-4906-8e7b-3721a2857935" (UID: "ae98a130-1216-4906-8e7b-3721a2857935"). 
InnerVolumeSpecName "kube-api-access-rw9km". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.976060 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-inventory" (OuterVolumeSpecName: "inventory") pod "ae98a130-1216-4906-8e7b-3721a2857935" (UID: "ae98a130-1216-4906-8e7b-3721a2857935"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:39 crc kubenswrapper[4696]: I0318 16:02:39.977938 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ae98a130-1216-4906-8e7b-3721a2857935" (UID: "ae98a130-1216-4906-8e7b-3721a2857935"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.042683 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f9gl\" (UniqueName: \"kubernetes.io/projected/2ff751f2-f660-4b99-9708-1885cd11b3bd-kube-api-access-8f9gl\") pod \"2ff751f2-f660-4b99-9708-1885cd11b3bd\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.042763 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-catalog-content\") pod \"2ff751f2-f660-4b99-9708-1885cd11b3bd\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.043136 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-utilities\") pod 
\"2ff751f2-f660-4b99-9708-1885cd11b3bd\" (UID: \"2ff751f2-f660-4b99-9708-1885cd11b3bd\") " Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.043761 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.043829 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae98a130-1216-4906-8e7b-3721a2857935-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.043842 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw9km\" (UniqueName: \"kubernetes.io/projected/ae98a130-1216-4906-8e7b-3721a2857935-kube-api-access-rw9km\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.044073 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-utilities" (OuterVolumeSpecName: "utilities") pod "2ff751f2-f660-4b99-9708-1885cd11b3bd" (UID: "2ff751f2-f660-4b99-9708-1885cd11b3bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.047117 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff751f2-f660-4b99-9708-1885cd11b3bd-kube-api-access-8f9gl" (OuterVolumeSpecName: "kube-api-access-8f9gl") pod "2ff751f2-f660-4b99-9708-1885cd11b3bd" (UID: "2ff751f2-f660-4b99-9708-1885cd11b3bd"). InnerVolumeSpecName "kube-api-access-8f9gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.102709 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ff751f2-f660-4b99-9708-1885cd11b3bd" (UID: "2ff751f2-f660-4b99-9708-1885cd11b3bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.145086 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.145121 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f9gl\" (UniqueName: \"kubernetes.io/projected/2ff751f2-f660-4b99-9708-1885cd11b3bd-kube-api-access-8f9gl\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.145133 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ff751f2-f660-4b99-9708-1885cd11b3bd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.357005 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" event={"ID":"ae98a130-1216-4906-8e7b-3721a2857935","Type":"ContainerDied","Data":"672bb12510e4c8d9d9c8b1ceeb9bb3b80097fa330339cfb6090fd9c509168250"} Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.357076 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="672bb12510e4c8d9d9c8b1ceeb9bb3b80097fa330339cfb6090fd9c509168250" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.357172 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vxq5r" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.365040 4696 generic.go:334] "Generic (PLEG): container finished" podID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerID="50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed" exitCode=0 Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.365096 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sq5r" event={"ID":"2ff751f2-f660-4b99-9708-1885cd11b3bd","Type":"ContainerDied","Data":"50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed"} Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.365129 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sq5r" event={"ID":"2ff751f2-f660-4b99-9708-1885cd11b3bd","Type":"ContainerDied","Data":"b8eeed16ed5ee75d05e09ad5f85e86c344aadda3d76ab4a4dc6c814ab12f3261"} Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.365150 4696 scope.go:117] "RemoveContainer" containerID="50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.365357 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8sq5r" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.411616 4696 scope.go:117] "RemoveContainer" containerID="527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.420478 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8sq5r"] Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.434431 4696 scope.go:117] "RemoveContainer" containerID="0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.436924 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8sq5r"] Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.456703 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh"] Mar 18 16:02:40 crc kubenswrapper[4696]: E0318 16:02:40.457266 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae98a130-1216-4906-8e7b-3721a2857935" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.457293 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae98a130-1216-4906-8e7b-3721a2857935" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 16:02:40 crc kubenswrapper[4696]: E0318 16:02:40.457328 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerName="extract-utilities" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.457338 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerName="extract-utilities" Mar 18 16:02:40 crc kubenswrapper[4696]: E0318 16:02:40.457362 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" 
containerName="extract-content" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.457369 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerName="extract-content" Mar 18 16:02:40 crc kubenswrapper[4696]: E0318 16:02:40.457390 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerName="registry-server" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.457397 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerName="registry-server" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.457642 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae98a130-1216-4906-8e7b-3721a2857935" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.457666 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" containerName="registry-server" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.458462 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.460912 4696 scope.go:117] "RemoveContainer" containerID="50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.465004 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh"] Mar 18 16:02:40 crc kubenswrapper[4696]: E0318 16:02:40.465991 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed\": container with ID starting with 50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed not found: ID does not exist" containerID="50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.466048 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed"} err="failed to get container status \"50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed\": rpc error: code = NotFound desc = could not find container \"50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed\": container with ID starting with 50949e8d980887ce0afd63e5c1e4428dd13eab703df8dc1fed7f9c5a820655ed not found: ID does not exist" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.466090 4696 scope.go:117] "RemoveContainer" containerID="527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.466252 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.466337 4696 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openstack"/"openstack-aee-default-env" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.466262 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:02:40 crc kubenswrapper[4696]: E0318 16:02:40.466647 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5\": container with ID starting with 527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5 not found: ID does not exist" containerID="527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.466731 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5"} err="failed to get container status \"527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5\": rpc error: code = NotFound desc = could not find container \"527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5\": container with ID starting with 527e49555b8a517f2b39d241d14c7db98557aadf69891a8d723686dee98a4fc5 not found: ID does not exist" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.466798 4696 scope.go:117] "RemoveContainer" containerID="0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a" Mar 18 16:02:40 crc kubenswrapper[4696]: E0318 16:02:40.467312 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a\": container with ID starting with 0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a not found: ID does not exist" containerID="0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 
16:02:40.467350 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a"} err="failed to get container status \"0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a\": rpc error: code = NotFound desc = could not find container \"0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a\": container with ID starting with 0948b3da49488abff934b56687057f3f0948d741af6cc6e198407805360fc69a not found: ID does not exist" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.476819 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.552699 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.553050 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.553092 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-ssh-key-openstack-edpm-ipam\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.553331 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkx6h\" (UniqueName: \"kubernetes.io/projected/1f8feb1b-5d39-4cb7-996f-dc5e34065193-kube-api-access-mkx6h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.655739 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkx6h\" (UniqueName: \"kubernetes.io/projected/1f8feb1b-5d39-4cb7-996f-dc5e34065193-kube-api-access-mkx6h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.655858 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.655946 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc 
kubenswrapper[4696]: I0318 16:02:40.655993 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.660674 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.661697 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.662074 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.677661 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkx6h\" (UniqueName: 
\"kubernetes.io/projected/1f8feb1b-5d39-4cb7-996f-dc5e34065193-kube-api-access-mkx6h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:40 crc kubenswrapper[4696]: I0318 16:02:40.789209 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:02:41 crc kubenswrapper[4696]: I0318 16:02:41.313123 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh"] Mar 18 16:02:41 crc kubenswrapper[4696]: I0318 16:02:41.379796 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" event={"ID":"1f8feb1b-5d39-4cb7-996f-dc5e34065193","Type":"ContainerStarted","Data":"1458288e0b68e3c75e061c3c77efc19670dedddc86668a912ce45b633cade000"} Mar 18 16:02:41 crc kubenswrapper[4696]: I0318 16:02:41.607361 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff751f2-f660-4b99-9708-1885cd11b3bd" path="/var/lib/kubelet/pods/2ff751f2-f660-4b99-9708-1885cd11b3bd/volumes" Mar 18 16:02:42 crc kubenswrapper[4696]: I0318 16:02:42.184187 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:02:42 crc kubenswrapper[4696]: I0318 16:02:42.184580 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 
18 16:02:42 crc kubenswrapper[4696]: I0318 16:02:42.414974 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" event={"ID":"1f8feb1b-5d39-4cb7-996f-dc5e34065193","Type":"ContainerStarted","Data":"840ab168ad8c9128954d9dc231d6928c0b44f1c5044ceacc95acf508a26ff51d"} Mar 18 16:02:42 crc kubenswrapper[4696]: I0318 16:02:42.447045 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" podStartSLOduration=2.015542541 podStartE2EDuration="2.447011955s" podCreationTimestamp="2026-03-18 16:02:40 +0000 UTC" firstStartedPulling="2026-03-18 16:02:41.318921764 +0000 UTC m=+1604.325095970" lastFinishedPulling="2026-03-18 16:02:41.750391178 +0000 UTC m=+1604.756565384" observedRunningTime="2026-03-18 16:02:42.436926012 +0000 UTC m=+1605.443100258" watchObservedRunningTime="2026-03-18 16:02:42.447011955 +0000 UTC m=+1605.453186191" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.555160 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2glbq"] Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.559133 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.571630 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2glbq"] Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.593306 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfzmt\" (UniqueName: \"kubernetes.io/projected/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-kube-api-access-pfzmt\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.593404 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-catalog-content\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.594068 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-utilities\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.696069 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzmt\" (UniqueName: \"kubernetes.io/projected/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-kube-api-access-pfzmt\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.696133 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-catalog-content\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.696163 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-utilities\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.696835 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-catalog-content\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.696851 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-utilities\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.736830 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfzmt\" (UniqueName: \"kubernetes.io/projected/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-kube-api-access-pfzmt\") pod \"certified-operators-2glbq\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:56 crc kubenswrapper[4696]: I0318 16:02:56.891441 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:02:57 crc kubenswrapper[4696]: I0318 16:02:57.445198 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2glbq"] Mar 18 16:02:57 crc kubenswrapper[4696]: I0318 16:02:57.614121 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2glbq" event={"ID":"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9","Type":"ContainerStarted","Data":"a7cdc537d4dd2378a1f9a5d04ede6012cb68dfeeb1c4ef45aaf467f4ae4a875d"} Mar 18 16:02:58 crc kubenswrapper[4696]: I0318 16:02:58.616180 4696 generic.go:334] "Generic (PLEG): container finished" podID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerID="e1cad57bf6e2565598e8f5d7be4ab29511bb53b2d773a1a23a0a2ea71c43c95b" exitCode=0 Mar 18 16:02:58 crc kubenswrapper[4696]: I0318 16:02:58.616294 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2glbq" event={"ID":"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9","Type":"ContainerDied","Data":"e1cad57bf6e2565598e8f5d7be4ab29511bb53b2d773a1a23a0a2ea71c43c95b"} Mar 18 16:03:00 crc kubenswrapper[4696]: I0318 16:03:00.647098 4696 generic.go:334] "Generic (PLEG): container finished" podID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerID="f0ea14d062097505ffdbb91a2c3e7f04946e03f52b870531d585688773f16b6a" exitCode=0 Mar 18 16:03:00 crc kubenswrapper[4696]: I0318 16:03:00.647178 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2glbq" event={"ID":"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9","Type":"ContainerDied","Data":"f0ea14d062097505ffdbb91a2c3e7f04946e03f52b870531d585688773f16b6a"} Mar 18 16:03:02 crc kubenswrapper[4696]: I0318 16:03:02.668467 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2glbq" 
event={"ID":"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9","Type":"ContainerStarted","Data":"e732a1272e46b31776481c44744f8c475afbcd0066eb328309d46fb2acd03078"} Mar 18 16:03:02 crc kubenswrapper[4696]: I0318 16:03:02.692158 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2glbq" podStartSLOduration=3.30891636 podStartE2EDuration="6.692133277s" podCreationTimestamp="2026-03-18 16:02:56 +0000 UTC" firstStartedPulling="2026-03-18 16:02:58.6188831 +0000 UTC m=+1621.625057316" lastFinishedPulling="2026-03-18 16:03:02.002100017 +0000 UTC m=+1625.008274233" observedRunningTime="2026-03-18 16:03:02.69027035 +0000 UTC m=+1625.696444576" watchObservedRunningTime="2026-03-18 16:03:02.692133277 +0000 UTC m=+1625.698307503" Mar 18 16:03:06 crc kubenswrapper[4696]: I0318 16:03:06.892127 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:03:06 crc kubenswrapper[4696]: I0318 16:03:06.892707 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:03:06 crc kubenswrapper[4696]: I0318 16:03:06.941333 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:03:07 crc kubenswrapper[4696]: I0318 16:03:07.826878 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:03:07 crc kubenswrapper[4696]: I0318 16:03:07.908935 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2glbq"] Mar 18 16:03:09 crc kubenswrapper[4696]: I0318 16:03:09.738013 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2glbq" podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerName="registry-server" 
containerID="cri-o://e732a1272e46b31776481c44744f8c475afbcd0066eb328309d46fb2acd03078" gracePeriod=2 Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.751898 4696 generic.go:334] "Generic (PLEG): container finished" podID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerID="e732a1272e46b31776481c44744f8c475afbcd0066eb328309d46fb2acd03078" exitCode=0 Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.751968 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2glbq" event={"ID":"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9","Type":"ContainerDied","Data":"e732a1272e46b31776481c44744f8c475afbcd0066eb328309d46fb2acd03078"} Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.752376 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2glbq" event={"ID":"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9","Type":"ContainerDied","Data":"a7cdc537d4dd2378a1f9a5d04ede6012cb68dfeeb1c4ef45aaf467f4ae4a875d"} Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.752399 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7cdc537d4dd2378a1f9a5d04ede6012cb68dfeeb1c4ef45aaf467f4ae4a875d" Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.811975 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.875390 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfzmt\" (UniqueName: \"kubernetes.io/projected/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-kube-api-access-pfzmt\") pod \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.875461 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-catalog-content\") pod \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.875614 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-utilities\") pod \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\" (UID: \"1746fa62-32cf-42f1-b6c8-60a3fc9cedd9\") " Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.877129 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-utilities" (OuterVolumeSpecName: "utilities") pod "1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" (UID: "1746fa62-32cf-42f1-b6c8-60a3fc9cedd9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.902111 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-kube-api-access-pfzmt" (OuterVolumeSpecName: "kube-api-access-pfzmt") pod "1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" (UID: "1746fa62-32cf-42f1-b6c8-60a3fc9cedd9"). InnerVolumeSpecName "kube-api-access-pfzmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.937566 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" (UID: "1746fa62-32cf-42f1-b6c8-60a3fc9cedd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.978681 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfzmt\" (UniqueName: \"kubernetes.io/projected/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-kube-api-access-pfzmt\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.979166 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:10 crc kubenswrapper[4696]: I0318 16:03:10.979178 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:11 crc kubenswrapper[4696]: I0318 16:03:11.760623 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2glbq" Mar 18 16:03:11 crc kubenswrapper[4696]: I0318 16:03:11.783162 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2glbq"] Mar 18 16:03:11 crc kubenswrapper[4696]: I0318 16:03:11.793425 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2glbq"] Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.185440 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.185827 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.186008 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.187139 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.187325 4696 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" gracePeriod=600 Mar 18 16:03:12 crc kubenswrapper[4696]: E0318 16:03:12.340512 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.775984 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" exitCode=0 Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.776046 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9"} Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.776447 4696 scope.go:117] "RemoveContainer" containerID="e7d6aeeccf3f0ce1fb7410ba06c6597ef8535cab4338e38adaf8fc42a5797086" Mar 18 16:03:12 crc kubenswrapper[4696]: I0318 16:03:12.777307 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:03:12 crc kubenswrapper[4696]: E0318 16:03:12.777789 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:03:13 crc kubenswrapper[4696]: I0318 16:03:13.611306 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" path="/var/lib/kubelet/pods/1746fa62-32cf-42f1-b6c8-60a3fc9cedd9/volumes" Mar 18 16:03:20 crc kubenswrapper[4696]: I0318 16:03:20.167513 4696 scope.go:117] "RemoveContainer" containerID="17bf9d28029c28497e000dca6f63f89dbaef9b7ec63b3c3be0a9331401fe8c3f" Mar 18 16:03:20 crc kubenswrapper[4696]: I0318 16:03:20.191620 4696 scope.go:117] "RemoveContainer" containerID="33bbac5754dd862078d8a24c3ef40c131595f1ee86df49baefb6cec43608b237" Mar 18 16:03:23 crc kubenswrapper[4696]: I0318 16:03:23.603608 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:03:23 crc kubenswrapper[4696]: E0318 16:03:23.604358 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:03:36 crc kubenswrapper[4696]: I0318 16:03:36.597830 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:03:36 crc kubenswrapper[4696]: E0318 16:03:36.598933 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:03:51 crc kubenswrapper[4696]: I0318 16:03:51.597963 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:03:51 crc kubenswrapper[4696]: E0318 16:03:51.598761 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.143772 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564164-b48nr"] Mar 18 16:04:00 crc kubenswrapper[4696]: E0318 16:04:00.144843 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerName="registry-server" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.144878 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerName="registry-server" Mar 18 16:04:00 crc kubenswrapper[4696]: E0318 16:04:00.144903 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerName="extract-content" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.144911 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerName="extract-content" Mar 18 16:04:00 crc kubenswrapper[4696]: E0318 16:04:00.144925 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerName="extract-utilities" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.144933 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerName="extract-utilities" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.145198 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1746fa62-32cf-42f1-b6c8-60a3fc9cedd9" containerName="registry-server" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.146998 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-b48nr" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.152358 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.152438 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.153178 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.165376 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-b48nr"] Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.175157 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncqmc\" (UniqueName: \"kubernetes.io/projected/64813596-49e9-4f19-8b88-c4861b9ec490-kube-api-access-ncqmc\") pod \"auto-csr-approver-29564164-b48nr\" (UID: \"64813596-49e9-4f19-8b88-c4861b9ec490\") " pod="openshift-infra/auto-csr-approver-29564164-b48nr" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.277113 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncqmc\" (UniqueName: 
\"kubernetes.io/projected/64813596-49e9-4f19-8b88-c4861b9ec490-kube-api-access-ncqmc\") pod \"auto-csr-approver-29564164-b48nr\" (UID: \"64813596-49e9-4f19-8b88-c4861b9ec490\") " pod="openshift-infra/auto-csr-approver-29564164-b48nr" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.298249 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncqmc\" (UniqueName: \"kubernetes.io/projected/64813596-49e9-4f19-8b88-c4861b9ec490-kube-api-access-ncqmc\") pod \"auto-csr-approver-29564164-b48nr\" (UID: \"64813596-49e9-4f19-8b88-c4861b9ec490\") " pod="openshift-infra/auto-csr-approver-29564164-b48nr" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.465673 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-b48nr" Mar 18 16:04:00 crc kubenswrapper[4696]: I0318 16:04:00.919540 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-b48nr"] Mar 18 16:04:00 crc kubenswrapper[4696]: W0318 16:04:00.924583 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64813596_49e9_4f19_8b88_c4861b9ec490.slice/crio-c37d6351da2c88cdb8e7c9bf4f283cde44e33d1f0bb1a9c8054709d4d52fc6dc WatchSource:0}: Error finding container c37d6351da2c88cdb8e7c9bf4f283cde44e33d1f0bb1a9c8054709d4d52fc6dc: Status 404 returned error can't find the container with id c37d6351da2c88cdb8e7c9bf4f283cde44e33d1f0bb1a9c8054709d4d52fc6dc Mar 18 16:04:01 crc kubenswrapper[4696]: I0318 16:04:01.251774 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-b48nr" event={"ID":"64813596-49e9-4f19-8b88-c4861b9ec490","Type":"ContainerStarted","Data":"c37d6351da2c88cdb8e7c9bf4f283cde44e33d1f0bb1a9c8054709d4d52fc6dc"} Mar 18 16:04:03 crc kubenswrapper[4696]: I0318 16:04:03.272573 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564164-b48nr" event={"ID":"64813596-49e9-4f19-8b88-c4861b9ec490","Type":"ContainerStarted","Data":"7607dc7f51f311add0fa7e16a6fe28fb14a30b35f32f5fdddddba2076e9cf6b8"} Mar 18 16:04:03 crc kubenswrapper[4696]: I0318 16:04:03.290362 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564164-b48nr" podStartSLOduration=1.208401821 podStartE2EDuration="3.290332713s" podCreationTimestamp="2026-03-18 16:04:00 +0000 UTC" firstStartedPulling="2026-03-18 16:04:00.929279209 +0000 UTC m=+1683.935453415" lastFinishedPulling="2026-03-18 16:04:03.011210101 +0000 UTC m=+1686.017384307" observedRunningTime="2026-03-18 16:04:03.286714233 +0000 UTC m=+1686.292888479" watchObservedRunningTime="2026-03-18 16:04:03.290332713 +0000 UTC m=+1686.296506919" Mar 18 16:04:04 crc kubenswrapper[4696]: I0318 16:04:04.288427 4696 generic.go:334] "Generic (PLEG): container finished" podID="64813596-49e9-4f19-8b88-c4861b9ec490" containerID="7607dc7f51f311add0fa7e16a6fe28fb14a30b35f32f5fdddddba2076e9cf6b8" exitCode=0 Mar 18 16:04:04 crc kubenswrapper[4696]: I0318 16:04:04.288505 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-b48nr" event={"ID":"64813596-49e9-4f19-8b88-c4861b9ec490","Type":"ContainerDied","Data":"7607dc7f51f311add0fa7e16a6fe28fb14a30b35f32f5fdddddba2076e9cf6b8"} Mar 18 16:04:05 crc kubenswrapper[4696]: I0318 16:04:05.599227 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:04:05 crc kubenswrapper[4696]: E0318 16:04:05.599824 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:04:05 crc kubenswrapper[4696]: I0318 16:04:05.663849 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-b48nr" Mar 18 16:04:05 crc kubenswrapper[4696]: I0318 16:04:05.790488 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncqmc\" (UniqueName: \"kubernetes.io/projected/64813596-49e9-4f19-8b88-c4861b9ec490-kube-api-access-ncqmc\") pod \"64813596-49e9-4f19-8b88-c4861b9ec490\" (UID: \"64813596-49e9-4f19-8b88-c4861b9ec490\") " Mar 18 16:04:05 crc kubenswrapper[4696]: I0318 16:04:05.796235 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64813596-49e9-4f19-8b88-c4861b9ec490-kube-api-access-ncqmc" (OuterVolumeSpecName: "kube-api-access-ncqmc") pod "64813596-49e9-4f19-8b88-c4861b9ec490" (UID: "64813596-49e9-4f19-8b88-c4861b9ec490"). InnerVolumeSpecName "kube-api-access-ncqmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:05 crc kubenswrapper[4696]: I0318 16:04:05.893877 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncqmc\" (UniqueName: \"kubernetes.io/projected/64813596-49e9-4f19-8b88-c4861b9ec490-kube-api-access-ncqmc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:06 crc kubenswrapper[4696]: I0318 16:04:06.311848 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-b48nr" event={"ID":"64813596-49e9-4f19-8b88-c4861b9ec490","Type":"ContainerDied","Data":"c37d6351da2c88cdb8e7c9bf4f283cde44e33d1f0bb1a9c8054709d4d52fc6dc"} Mar 18 16:04:06 crc kubenswrapper[4696]: I0318 16:04:06.312205 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c37d6351da2c88cdb8e7c9bf4f283cde44e33d1f0bb1a9c8054709d4d52fc6dc" Mar 18 16:04:06 crc kubenswrapper[4696]: I0318 16:04:06.311875 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-b48nr" Mar 18 16:04:06 crc kubenswrapper[4696]: I0318 16:04:06.362396 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-srm4f"] Mar 18 16:04:06 crc kubenswrapper[4696]: I0318 16:04:06.373721 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-srm4f"] Mar 18 16:04:07 crc kubenswrapper[4696]: I0318 16:04:07.612834 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="461ec5b2-5f78-4e16-b815-120c9181c0d5" path="/var/lib/kubelet/pods/461ec5b2-5f78-4e16-b815-120c9181c0d5/volumes" Mar 18 16:04:18 crc kubenswrapper[4696]: I0318 16:04:18.597880 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:04:18 crc kubenswrapper[4696]: E0318 16:04:18.598651 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:04:20 crc kubenswrapper[4696]: I0318 16:04:20.359456 4696 scope.go:117] "RemoveContainer" containerID="4b9356aa13581adef14af62b6776b37bd2c42ddc713862187e0a3b3f219ab5ad" Mar 18 16:04:31 crc kubenswrapper[4696]: I0318 16:04:31.597543 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:04:31 crc kubenswrapper[4696]: E0318 16:04:31.598275 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:04:45 crc kubenswrapper[4696]: I0318 16:04:45.597698 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:04:45 crc kubenswrapper[4696]: E0318 16:04:45.598541 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:04:58 crc kubenswrapper[4696]: I0318 16:04:58.598190 4696 scope.go:117] "RemoveContainer" 
containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:04:58 crc kubenswrapper[4696]: E0318 16:04:58.598840 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:05:11 crc kubenswrapper[4696]: I0318 16:05:11.597854 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:05:11 crc kubenswrapper[4696]: E0318 16:05:11.598740 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:05:20 crc kubenswrapper[4696]: I0318 16:05:20.447547 4696 scope.go:117] "RemoveContainer" containerID="8e6768584d4dd707802d8193590777aea9fbce027a39c4ea8cea1278810bf30c" Mar 18 16:05:23 crc kubenswrapper[4696]: I0318 16:05:23.597461 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:05:23 crc kubenswrapper[4696]: E0318 16:05:23.598238 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:05:37 crc kubenswrapper[4696]: I0318 16:05:37.608114 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:05:37 crc kubenswrapper[4696]: E0318 16:05:37.611122 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:05:50 crc kubenswrapper[4696]: I0318 16:05:50.597814 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:05:50 crc kubenswrapper[4696]: E0318 16:05:50.598554 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:05:52 crc kubenswrapper[4696]: I0318 16:05:52.331834 4696 generic.go:334] "Generic (PLEG): container finished" podID="1f8feb1b-5d39-4cb7-996f-dc5e34065193" containerID="840ab168ad8c9128954d9dc231d6928c0b44f1c5044ceacc95acf508a26ff51d" exitCode=0 Mar 18 16:05:52 crc kubenswrapper[4696]: I0318 16:05:52.331908 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" 
event={"ID":"1f8feb1b-5d39-4cb7-996f-dc5e34065193","Type":"ContainerDied","Data":"840ab168ad8c9128954d9dc231d6928c0b44f1c5044ceacc95acf508a26ff51d"} Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.742405 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.895841 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-bootstrap-combined-ca-bundle\") pod \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.896306 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkx6h\" (UniqueName: \"kubernetes.io/projected/1f8feb1b-5d39-4cb7-996f-dc5e34065193-kube-api-access-mkx6h\") pod \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.896457 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-ssh-key-openstack-edpm-ipam\") pod \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.896484 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-inventory\") pod \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\" (UID: \"1f8feb1b-5d39-4cb7-996f-dc5e34065193\") " Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.906803 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1f8feb1b-5d39-4cb7-996f-dc5e34065193-kube-api-access-mkx6h" (OuterVolumeSpecName: "kube-api-access-mkx6h") pod "1f8feb1b-5d39-4cb7-996f-dc5e34065193" (UID: "1f8feb1b-5d39-4cb7-996f-dc5e34065193"). InnerVolumeSpecName "kube-api-access-mkx6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.906899 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1f8feb1b-5d39-4cb7-996f-dc5e34065193" (UID: "1f8feb1b-5d39-4cb7-996f-dc5e34065193"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.924454 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f8feb1b-5d39-4cb7-996f-dc5e34065193" (UID: "1f8feb1b-5d39-4cb7-996f-dc5e34065193"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.925444 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-inventory" (OuterVolumeSpecName: "inventory") pod "1f8feb1b-5d39-4cb7-996f-dc5e34065193" (UID: "1f8feb1b-5d39-4cb7-996f-dc5e34065193"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.999069 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.999109 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.999120 4696 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8feb1b-5d39-4cb7-996f-dc5e34065193-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:05:53 crc kubenswrapper[4696]: I0318 16:05:53.999132 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkx6h\" (UniqueName: \"kubernetes.io/projected/1f8feb1b-5d39-4cb7-996f-dc5e34065193-kube-api-access-mkx6h\") on node \"crc\" DevicePath \"\"" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.353216 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" event={"ID":"1f8feb1b-5d39-4cb7-996f-dc5e34065193","Type":"ContainerDied","Data":"1458288e0b68e3c75e061c3c77efc19670dedddc86668a912ce45b633cade000"} Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.353261 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1458288e0b68e3c75e061c3c77efc19670dedddc86668a912ce45b633cade000" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.353323 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.433341 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4"] Mar 18 16:05:54 crc kubenswrapper[4696]: E0318 16:05:54.433848 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8feb1b-5d39-4cb7-996f-dc5e34065193" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.433870 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8feb1b-5d39-4cb7-996f-dc5e34065193" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 16:05:54 crc kubenswrapper[4696]: E0318 16:05:54.433897 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64813596-49e9-4f19-8b88-c4861b9ec490" containerName="oc" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.433905 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="64813596-49e9-4f19-8b88-c4861b9ec490" containerName="oc" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.434090 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8feb1b-5d39-4cb7-996f-dc5e34065193" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.434125 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="64813596-49e9-4f19-8b88-c4861b9ec490" containerName="oc" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.434797 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.437421 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.438762 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.438962 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.438967 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.444432 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4"] Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.610323 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtm5\" (UniqueName: \"kubernetes.io/projected/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-kube-api-access-4wtm5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.610575 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc 
kubenswrapper[4696]: I0318 16:05:54.610768 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.712617 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.712701 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.712790 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtm5\" (UniqueName: \"kubernetes.io/projected/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-kube-api-access-4wtm5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.720551 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.724274 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.732965 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtm5\" (UniqueName: \"kubernetes.io/projected/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-kube-api-access-4wtm5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:54 crc kubenswrapper[4696]: I0318 16:05:54.750660 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:05:55 crc kubenswrapper[4696]: I0318 16:05:55.089457 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4"] Mar 18 16:05:55 crc kubenswrapper[4696]: I0318 16:05:55.108951 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:05:55 crc kubenswrapper[4696]: I0318 16:05:55.363142 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" event={"ID":"c9ae5aa3-8f8f-4951-85ec-1b3583c90481","Type":"ContainerStarted","Data":"4be9f40cba9ec287cfc17f63d8f4cb0efd4156dfb5a47f76eb44566aa7177710"} Mar 18 16:05:56 crc kubenswrapper[4696]: I0318 16:05:56.372090 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" event={"ID":"c9ae5aa3-8f8f-4951-85ec-1b3583c90481","Type":"ContainerStarted","Data":"6ab05ec84e6c281ac905da6ebca15206c43f0e1280806fe8b7237d9e1c7ff3bf"} Mar 18 16:05:56 crc kubenswrapper[4696]: I0318 16:05:56.412830 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" podStartSLOduration=2.203140263 podStartE2EDuration="2.412809777s" podCreationTimestamp="2026-03-18 16:05:54 +0000 UTC" firstStartedPulling="2026-03-18 16:05:55.108690052 +0000 UTC m=+1798.114864258" lastFinishedPulling="2026-03-18 16:05:55.318359576 +0000 UTC m=+1798.324533772" observedRunningTime="2026-03-18 16:05:56.411787701 +0000 UTC m=+1799.417961907" watchObservedRunningTime="2026-03-18 16:05:56.412809777 +0000 UTC m=+1799.418983993" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.129708 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564166-dwq7v"] Mar 18 16:06:00 crc 
kubenswrapper[4696]: I0318 16:06:00.131423 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-dwq7v" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.133388 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.133944 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.134046 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.154796 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-dwq7v"] Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.214479 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zf7w\" (UniqueName: \"kubernetes.io/projected/faafbd48-4d87-4569-ade7-f00d32a4c6a2-kube-api-access-8zf7w\") pod \"auto-csr-approver-29564166-dwq7v\" (UID: \"faafbd48-4d87-4569-ade7-f00d32a4c6a2\") " pod="openshift-infra/auto-csr-approver-29564166-dwq7v" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.316289 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zf7w\" (UniqueName: \"kubernetes.io/projected/faafbd48-4d87-4569-ade7-f00d32a4c6a2-kube-api-access-8zf7w\") pod \"auto-csr-approver-29564166-dwq7v\" (UID: \"faafbd48-4d87-4569-ade7-f00d32a4c6a2\") " pod="openshift-infra/auto-csr-approver-29564166-dwq7v" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.337057 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zf7w\" (UniqueName: \"kubernetes.io/projected/faafbd48-4d87-4569-ade7-f00d32a4c6a2-kube-api-access-8zf7w\") pod 
\"auto-csr-approver-29564166-dwq7v\" (UID: \"faafbd48-4d87-4569-ade7-f00d32a4c6a2\") " pod="openshift-infra/auto-csr-approver-29564166-dwq7v" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.471743 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-dwq7v" Mar 18 16:06:00 crc kubenswrapper[4696]: I0318 16:06:00.918958 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-dwq7v"] Mar 18 16:06:01 crc kubenswrapper[4696]: I0318 16:06:01.420322 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-dwq7v" event={"ID":"faafbd48-4d87-4569-ade7-f00d32a4c6a2","Type":"ContainerStarted","Data":"7d94eda0fdd56991cee20b4b00dd5c334155796e99986a0569e328869cdc5230"} Mar 18 16:06:03 crc kubenswrapper[4696]: I0318 16:06:03.442269 4696 generic.go:334] "Generic (PLEG): container finished" podID="faafbd48-4d87-4569-ade7-f00d32a4c6a2" containerID="4237901eadb94cb2b952bf5b0f3902642c324f0936efaecf4454130e7eb0819c" exitCode=0 Mar 18 16:06:03 crc kubenswrapper[4696]: I0318 16:06:03.442434 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-dwq7v" event={"ID":"faafbd48-4d87-4569-ade7-f00d32a4c6a2","Type":"ContainerDied","Data":"4237901eadb94cb2b952bf5b0f3902642c324f0936efaecf4454130e7eb0819c"} Mar 18 16:06:04 crc kubenswrapper[4696]: I0318 16:06:04.818124 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-dwq7v" Mar 18 16:06:04 crc kubenswrapper[4696]: I0318 16:06:04.896956 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zf7w\" (UniqueName: \"kubernetes.io/projected/faafbd48-4d87-4569-ade7-f00d32a4c6a2-kube-api-access-8zf7w\") pod \"faafbd48-4d87-4569-ade7-f00d32a4c6a2\" (UID: \"faafbd48-4d87-4569-ade7-f00d32a4c6a2\") " Mar 18 16:06:04 crc kubenswrapper[4696]: I0318 16:06:04.902780 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faafbd48-4d87-4569-ade7-f00d32a4c6a2-kube-api-access-8zf7w" (OuterVolumeSpecName: "kube-api-access-8zf7w") pod "faafbd48-4d87-4569-ade7-f00d32a4c6a2" (UID: "faafbd48-4d87-4569-ade7-f00d32a4c6a2"). InnerVolumeSpecName "kube-api-access-8zf7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:06:05 crc kubenswrapper[4696]: I0318 16:06:05.002258 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zf7w\" (UniqueName: \"kubernetes.io/projected/faafbd48-4d87-4569-ade7-f00d32a4c6a2-kube-api-access-8zf7w\") on node \"crc\" DevicePath \"\"" Mar 18 16:06:05 crc kubenswrapper[4696]: I0318 16:06:05.461873 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-dwq7v" event={"ID":"faafbd48-4d87-4569-ade7-f00d32a4c6a2","Type":"ContainerDied","Data":"7d94eda0fdd56991cee20b4b00dd5c334155796e99986a0569e328869cdc5230"} Mar 18 16:06:05 crc kubenswrapper[4696]: I0318 16:06:05.461951 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d94eda0fdd56991cee20b4b00dd5c334155796e99986a0569e328869cdc5230" Mar 18 16:06:05 crc kubenswrapper[4696]: I0318 16:06:05.461909 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-dwq7v" Mar 18 16:06:05 crc kubenswrapper[4696]: I0318 16:06:05.597320 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:06:05 crc kubenswrapper[4696]: E0318 16:06:05.597922 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:06:05 crc kubenswrapper[4696]: I0318 16:06:05.890023 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-sfqqd"] Mar 18 16:06:05 crc kubenswrapper[4696]: I0318 16:06:05.900843 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-sfqqd"] Mar 18 16:06:07 crc kubenswrapper[4696]: I0318 16:06:07.610013 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74ca80c8-a4c1-4f87-9e78-5648ca013164" path="/var/lib/kubelet/pods/74ca80c8-a4c1-4f87-9e78-5648ca013164/volumes" Mar 18 16:06:09 crc kubenswrapper[4696]: I0318 16:06:09.030901 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-f8nj6"] Mar 18 16:06:09 crc kubenswrapper[4696]: I0318 16:06:09.040125 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-161d-account-create-update-cw9fd"] Mar 18 16:06:09 crc kubenswrapper[4696]: I0318 16:06:09.049189 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-f8nj6"] Mar 18 16:06:09 crc kubenswrapper[4696]: I0318 16:06:09.056830 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-161d-account-create-update-cw9fd"] Mar 18 16:06:09 crc kubenswrapper[4696]: I0318 16:06:09.615482 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2662d61f-9289-4c66-8823-4bb09d86dd75" path="/var/lib/kubelet/pods/2662d61f-9289-4c66-8823-4bb09d86dd75/volumes" Mar 18 16:06:09 crc kubenswrapper[4696]: I0318 16:06:09.616253 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7" path="/var/lib/kubelet/pods/4d2d55c5-26d3-4b9c-9fb4-8a56baf576b7/volumes" Mar 18 16:06:10 crc kubenswrapper[4696]: I0318 16:06:10.042803 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-th84q"] Mar 18 16:06:10 crc kubenswrapper[4696]: I0318 16:06:10.055652 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d8de-account-create-update-r4hss"] Mar 18 16:06:10 crc kubenswrapper[4696]: I0318 16:06:10.073931 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-th84q"] Mar 18 16:06:10 crc kubenswrapper[4696]: I0318 16:06:10.083138 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d8de-account-create-update-r4hss"] Mar 18 16:06:10 crc kubenswrapper[4696]: I0318 16:06:10.091743 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-effc-account-create-update-phf9v"] Mar 18 16:06:10 crc kubenswrapper[4696]: I0318 16:06:10.099401 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jgf82"] Mar 18 16:06:10 crc kubenswrapper[4696]: I0318 16:06:10.106176 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-effc-account-create-update-phf9v"] Mar 18 16:06:10 crc kubenswrapper[4696]: I0318 16:06:10.113556 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jgf82"] Mar 18 16:06:11 crc kubenswrapper[4696]: I0318 16:06:11.609702 4696 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a302450-56dc-4388-8796-f657954f0e25" path="/var/lib/kubelet/pods/8a302450-56dc-4388-8796-f657954f0e25/volumes" Mar 18 16:06:11 crc kubenswrapper[4696]: I0318 16:06:11.610898 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c15c3a-cbbf-4ebf-b594-1782495f18db" path="/var/lib/kubelet/pods/90c15c3a-cbbf-4ebf-b594-1782495f18db/volumes" Mar 18 16:06:11 crc kubenswrapper[4696]: I0318 16:06:11.611494 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d2b900-71d4-4dfd-bfda-07d44f39ee48" path="/var/lib/kubelet/pods/a6d2b900-71d4-4dfd-bfda-07d44f39ee48/volumes" Mar 18 16:06:11 crc kubenswrapper[4696]: I0318 16:06:11.612151 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ad9bfa-558d-440b-9297-c145b93193c2" path="/var/lib/kubelet/pods/b0ad9bfa-558d-440b-9297-c145b93193c2/volumes" Mar 18 16:06:17 crc kubenswrapper[4696]: I0318 16:06:17.604752 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:06:17 crc kubenswrapper[4696]: E0318 16:06:17.620051 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:06:20 crc kubenswrapper[4696]: I0318 16:06:20.518238 4696 scope.go:117] "RemoveContainer" containerID="007251e633b476c0d4d6e5a3beeff4f57dbfe03715b5d0c6ee78022b6c342c9b" Mar 18 16:06:20 crc kubenswrapper[4696]: I0318 16:06:20.552345 4696 scope.go:117] "RemoveContainer" containerID="a31eb740c98c11bd1dbbe21b974e9398871d2793161b81fef1ff3d52d6c553f9" Mar 18 16:06:20 crc 
kubenswrapper[4696]: I0318 16:06:20.641255 4696 scope.go:117] "RemoveContainer" containerID="3774ac525944a9276f9bf8db749c93c0b38edc378c7ed2aeaa398d6922445351" Mar 18 16:06:20 crc kubenswrapper[4696]: I0318 16:06:20.672402 4696 scope.go:117] "RemoveContainer" containerID="965d06eab2335d06b80091056473ffd23c9252c3b119f1dd46631d4cbce43b45" Mar 18 16:06:20 crc kubenswrapper[4696]: I0318 16:06:20.723339 4696 scope.go:117] "RemoveContainer" containerID="d19778f8475cb901076d1c303b9beb3e0d8935535e7ba653d9a58ea6b63a2908" Mar 18 16:06:20 crc kubenswrapper[4696]: I0318 16:06:20.785974 4696 scope.go:117] "RemoveContainer" containerID="0d6c20723944e6fcaf0a2c7c5c4d724f650152b99555979b8c399aac608f67ab" Mar 18 16:06:20 crc kubenswrapper[4696]: I0318 16:06:20.833275 4696 scope.go:117] "RemoveContainer" containerID="3dadb71be7e3477427169d4d6cf08f73bc5befa344475f97d547fa61533071d3" Mar 18 16:06:20 crc kubenswrapper[4696]: I0318 16:06:20.855402 4696 scope.go:117] "RemoveContainer" containerID="a5995040a71b2e41a26358367386a0bcac58f1a210932ca456db70d802245f37" Mar 18 16:06:30 crc kubenswrapper[4696]: I0318 16:06:30.042384 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-g666r"] Mar 18 16:06:30 crc kubenswrapper[4696]: I0318 16:06:30.052180 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-g666r"] Mar 18 16:06:31 crc kubenswrapper[4696]: I0318 16:06:31.608014 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ecc605-5dea-4131-9161-7adf4ba2db45" path="/var/lib/kubelet/pods/78ecc605-5dea-4131-9161-7adf4ba2db45/volumes" Mar 18 16:06:32 crc kubenswrapper[4696]: I0318 16:06:32.597099 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:06:32 crc kubenswrapper[4696]: E0318 16:06:32.597724 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.050447 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-z72n4"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.062557 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-97fc-account-create-update-xgqkx"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.071332 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-pbhp6"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.095118 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0180-account-create-update-dk749"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.103992 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-12ff-account-create-update-mqz6l"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.111216 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-pbhp6"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.119577 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-12ff-account-create-update-mqz6l"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.126630 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-97fc-account-create-update-xgqkx"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.135205 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0180-account-create-update-dk749"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.142713 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-z72n4"] Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.608233 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0788afc4-c079-463b-8d56-3a6be70dbf51" path="/var/lib/kubelet/pods/0788afc4-c079-463b-8d56-3a6be70dbf51/volumes" Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.609079 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b3c462-44a9-4899-a39b-463eda7dd5d0" path="/var/lib/kubelet/pods/22b3c462-44a9-4899-a39b-463eda7dd5d0/volumes" Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.609806 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370a3d71-3446-40b6-80ed-7efdb61a36a5" path="/var/lib/kubelet/pods/370a3d71-3446-40b6-80ed-7efdb61a36a5/volumes" Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.610455 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c" path="/var/lib/kubelet/pods/7fcca3cf-ba6e-4bc3-a2f9-76d646b9ad9c/volumes" Mar 18 16:06:39 crc kubenswrapper[4696]: I0318 16:06:39.611484 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb0c55b7-1488-4ae0-8d27-d063761edde5" path="/var/lib/kubelet/pods/cb0c55b7-1488-4ae0-8d27-d063761edde5/volumes" Mar 18 16:06:42 crc kubenswrapper[4696]: I0318 16:06:42.032603 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-twrd5"] Mar 18 16:06:42 crc kubenswrapper[4696]: I0318 16:06:42.043506 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-twrd5"] Mar 18 16:06:43 crc kubenswrapper[4696]: I0318 16:06:43.603012 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:06:43 crc kubenswrapper[4696]: E0318 16:06:43.604875 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:06:43 crc kubenswrapper[4696]: I0318 16:06:43.632500 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b834ff85-81d1-4a20-9f59-0790a7492dfc" path="/var/lib/kubelet/pods/b834ff85-81d1-4a20-9f59-0790a7492dfc/volumes" Mar 18 16:06:46 crc kubenswrapper[4696]: I0318 16:06:46.030171 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9jmt6"] Mar 18 16:06:46 crc kubenswrapper[4696]: I0318 16:06:46.040095 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9jmt6"] Mar 18 16:06:47 crc kubenswrapper[4696]: I0318 16:06:47.607877 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c5159e-3018-4c1b-8f1c-b40e157d043b" path="/var/lib/kubelet/pods/c6c5159e-3018-4c1b-8f1c-b40e157d043b/volumes" Mar 18 16:06:58 crc kubenswrapper[4696]: I0318 16:06:58.597383 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:06:58 crc kubenswrapper[4696]: E0318 16:06:58.598353 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:07:06 crc kubenswrapper[4696]: I0318 16:07:06.044975 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-sfb9t"] Mar 18 16:07:06 crc kubenswrapper[4696]: I0318 16:07:06.053887 4696 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-sfb9t"] Mar 18 16:07:07 crc kubenswrapper[4696]: I0318 16:07:07.610629 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfc5851-8295-4c4d-8cb4-4c18f9827227" path="/var/lib/kubelet/pods/6cfc5851-8295-4c4d-8cb4-4c18f9827227/volumes" Mar 18 16:07:11 crc kubenswrapper[4696]: I0318 16:07:11.598220 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:07:11 crc kubenswrapper[4696]: E0318 16:07:11.598938 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:07:19 crc kubenswrapper[4696]: I0318 16:07:19.044675 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-h94vl"] Mar 18 16:07:19 crc kubenswrapper[4696]: I0318 16:07:19.056323 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-h94vl"] Mar 18 16:07:19 crc kubenswrapper[4696]: I0318 16:07:19.618115 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e07ce44-d586-4f16-952b-5e1eb3b0cfa6" path="/var/lib/kubelet/pods/0e07ce44-d586-4f16-952b-5e1eb3b0cfa6/volumes" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.021973 4696 scope.go:117] "RemoveContainer" containerID="5422047cd85de969300b22e3f45d67a48493b53b10f136aeee116bac24add09a" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.048669 4696 scope.go:117] "RemoveContainer" containerID="6ced00f98ca046024dae8e9ce6bdae0071fa5ae95596ed961d36eb7f76d42250" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.110837 4696 
scope.go:117] "RemoveContainer" containerID="c8708acc6cf893d512b0b488bdc35b0d9de8fe29b6fcdbf4bad6511fe1dcb787" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.152377 4696 scope.go:117] "RemoveContainer" containerID="c7e497d3a38b51966753a5b7a8887757487d9addbb5506013a45052362a2a504" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.188373 4696 scope.go:117] "RemoveContainer" containerID="178a3592ec78d7aebcb36d5c68a0222e89f924f5d132570b1f4ab19d655171a6" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.223504 4696 scope.go:117] "RemoveContainer" containerID="b99d4f81a0eb301f1bf9d22f4abef1b7e592e7d87e5c4dd546f2037b2f8a9321" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.284714 4696 scope.go:117] "RemoveContainer" containerID="27865e2af2387aeadd20105cbe4941b24c3d43b80e358bf96cc8c0de2ad19f8d" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.306129 4696 scope.go:117] "RemoveContainer" containerID="baa59b49b2d04c27eac13aa0671f98810ecefa397af8a9a414f1ca6baa5a8367" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.340499 4696 scope.go:117] "RemoveContainer" containerID="e358bac5c31b29168432e7202220daa7e2674d485b8ecfa9f8e1a2f5a6fb25f1" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.361677 4696 scope.go:117] "RemoveContainer" containerID="857d54e8e1a459e51f8e42f236e58dd773b4b9b1cb4bc6ce7c39f8b55bc22fd4" Mar 18 16:07:21 crc kubenswrapper[4696]: I0318 16:07:21.393688 4696 scope.go:117] "RemoveContainer" containerID="fc978c83701e43ec969e0cf94165ae2aacd2fcac4bf6931bdf030d1808b2cd67" Mar 18 16:07:23 crc kubenswrapper[4696]: I0318 16:07:23.597208 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:07:23 crc kubenswrapper[4696]: E0318 16:07:23.597734 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:07:28 crc kubenswrapper[4696]: I0318 16:07:28.234572 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9ae5aa3-8f8f-4951-85ec-1b3583c90481" containerID="6ab05ec84e6c281ac905da6ebca15206c43f0e1280806fe8b7237d9e1c7ff3bf" exitCode=0 Mar 18 16:07:28 crc kubenswrapper[4696]: I0318 16:07:28.234632 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" event={"ID":"c9ae5aa3-8f8f-4951-85ec-1b3583c90481","Type":"ContainerDied","Data":"6ab05ec84e6c281ac905da6ebca15206c43f0e1280806fe8b7237d9e1c7ff3bf"} Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.049078 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jvqtp"] Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.060429 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zfkzp"] Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.072148 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zfkzp"] Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.083795 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jvqtp"] Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.607458 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319d79c7-8160-4b57-9b13-3797a015cbdf" path="/var/lib/kubelet/pods/319d79c7-8160-4b57-9b13-3797a015cbdf/volumes" Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.608337 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f27b4c3-3df4-4f88-9bf1-b0f4c242d617" path="/var/lib/kubelet/pods/4f27b4c3-3df4-4f88-9bf1-b0f4c242d617/volumes" Mar 18 16:07:29 crc kubenswrapper[4696]: 
I0318 16:07:29.645134 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.679451 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-ssh-key-openstack-edpm-ipam\") pod \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.679547 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtm5\" (UniqueName: \"kubernetes.io/projected/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-kube-api-access-4wtm5\") pod \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.679592 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-inventory\") pod \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\" (UID: \"c9ae5aa3-8f8f-4951-85ec-1b3583c90481\") " Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.685743 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-kube-api-access-4wtm5" (OuterVolumeSpecName: "kube-api-access-4wtm5") pod "c9ae5aa3-8f8f-4951-85ec-1b3583c90481" (UID: "c9ae5aa3-8f8f-4951-85ec-1b3583c90481"). InnerVolumeSpecName "kube-api-access-4wtm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.707911 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c9ae5aa3-8f8f-4951-85ec-1b3583c90481" (UID: "c9ae5aa3-8f8f-4951-85ec-1b3583c90481"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.711926 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-inventory" (OuterVolumeSpecName: "inventory") pod "c9ae5aa3-8f8f-4951-85ec-1b3583c90481" (UID: "c9ae5aa3-8f8f-4951-85ec-1b3583c90481"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.781616 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.781655 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wtm5\" (UniqueName: \"kubernetes.io/projected/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-kube-api-access-4wtm5\") on node \"crc\" DevicePath \"\"" Mar 18 16:07:29 crc kubenswrapper[4696]: I0318 16:07:29.781665 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9ae5aa3-8f8f-4951-85ec-1b3583c90481-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.250225 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" 
event={"ID":"c9ae5aa3-8f8f-4951-85ec-1b3583c90481","Type":"ContainerDied","Data":"4be9f40cba9ec287cfc17f63d8f4cb0efd4156dfb5a47f76eb44566aa7177710"} Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.250551 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4be9f40cba9ec287cfc17f63d8f4cb0efd4156dfb5a47f76eb44566aa7177710" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.250287 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.378168 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb"] Mar 18 16:07:30 crc kubenswrapper[4696]: E0318 16:07:30.378711 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ae5aa3-8f8f-4951-85ec-1b3583c90481" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.378734 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ae5aa3-8f8f-4951-85ec-1b3583c90481" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 16:07:30 crc kubenswrapper[4696]: E0318 16:07:30.378752 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faafbd48-4d87-4569-ade7-f00d32a4c6a2" containerName="oc" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.378761 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="faafbd48-4d87-4569-ade7-f00d32a4c6a2" containerName="oc" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.378994 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ae5aa3-8f8f-4951-85ec-1b3583c90481" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.379033 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="faafbd48-4d87-4569-ade7-f00d32a4c6a2" containerName="oc" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.379838 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.382411 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.382929 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.383077 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.385338 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.387231 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb"] Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.498650 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p95hg\" (UniqueName: \"kubernetes.io/projected/f2c235ca-a193-47df-8495-600e7c8eea37-kube-api-access-p95hg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.498763 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.498797 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.600440 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p95hg\" (UniqueName: \"kubernetes.io/projected/f2c235ca-a193-47df-8495-600e7c8eea37-kube-api-access-p95hg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.600507 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.600539 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.604592 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.605494 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.616581 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p95hg\" (UniqueName: \"kubernetes.io/projected/f2c235ca-a193-47df-8495-600e7c8eea37-kube-api-access-p95hg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:30 crc kubenswrapper[4696]: I0318 16:07:30.699583 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:07:31 crc kubenswrapper[4696]: I0318 16:07:31.050762 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fxg9t"] Mar 18 16:07:31 crc kubenswrapper[4696]: I0318 16:07:31.063701 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fxg9t"] Mar 18 16:07:31 crc kubenswrapper[4696]: I0318 16:07:31.208650 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb"] Mar 18 16:07:31 crc kubenswrapper[4696]: I0318 16:07:31.264945 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" event={"ID":"f2c235ca-a193-47df-8495-600e7c8eea37","Type":"ContainerStarted","Data":"5ed432ecc12a852b0082a6132b99a654b060194572aadcb5eead0f2d084fcf8d"} Mar 18 16:07:31 crc kubenswrapper[4696]: I0318 16:07:31.612633 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527c444b-3209-4c1e-addb-ed9404ab8efd" path="/var/lib/kubelet/pods/527c444b-3209-4c1e-addb-ed9404ab8efd/volumes" Mar 18 16:07:32 crc kubenswrapper[4696]: I0318 16:07:32.278257 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" event={"ID":"f2c235ca-a193-47df-8495-600e7c8eea37","Type":"ContainerStarted","Data":"ad103f1b8ecc9096f697b6819e57391e0f730a1cfd9871563c717dcbc7273312"} Mar 18 16:07:32 crc kubenswrapper[4696]: I0318 16:07:32.300437 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" podStartSLOduration=1.867925048 podStartE2EDuration="2.300420814s" podCreationTimestamp="2026-03-18 16:07:30 +0000 UTC" firstStartedPulling="2026-03-18 16:07:31.211211192 +0000 UTC m=+1894.217385398" lastFinishedPulling="2026-03-18 
16:07:31.643706958 +0000 UTC m=+1894.649881164" observedRunningTime="2026-03-18 16:07:32.29748769 +0000 UTC m=+1895.303661896" watchObservedRunningTime="2026-03-18 16:07:32.300420814 +0000 UTC m=+1895.306595020" Mar 18 16:07:36 crc kubenswrapper[4696]: I0318 16:07:36.597378 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:07:36 crc kubenswrapper[4696]: E0318 16:07:36.599880 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:07:43 crc kubenswrapper[4696]: I0318 16:07:43.037174 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-v2tzn"] Mar 18 16:07:43 crc kubenswrapper[4696]: I0318 16:07:43.049976 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-v2tzn"] Mar 18 16:07:43 crc kubenswrapper[4696]: I0318 16:07:43.609950 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a76866d-35bf-4dee-8fc4-a5c018e9edce" path="/var/lib/kubelet/pods/4a76866d-35bf-4dee-8fc4-a5c018e9edce/volumes" Mar 18 16:07:50 crc kubenswrapper[4696]: I0318 16:07:50.597730 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:07:50 crc kubenswrapper[4696]: E0318 16:07:50.599427 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.140053 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564168-46r9b"] Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.142788 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-46r9b" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.145153 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.145772 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.152831 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.162826 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-46r9b"] Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.203181 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld9mz\" (UniqueName: \"kubernetes.io/projected/f401c295-b79a-4193-a1a7-f40d9b42a96a-kube-api-access-ld9mz\") pod \"auto-csr-approver-29564168-46r9b\" (UID: \"f401c295-b79a-4193-a1a7-f40d9b42a96a\") " pod="openshift-infra/auto-csr-approver-29564168-46r9b" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.305448 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld9mz\" (UniqueName: \"kubernetes.io/projected/f401c295-b79a-4193-a1a7-f40d9b42a96a-kube-api-access-ld9mz\") 
pod \"auto-csr-approver-29564168-46r9b\" (UID: \"f401c295-b79a-4193-a1a7-f40d9b42a96a\") " pod="openshift-infra/auto-csr-approver-29564168-46r9b" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.327422 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld9mz\" (UniqueName: \"kubernetes.io/projected/f401c295-b79a-4193-a1a7-f40d9b42a96a-kube-api-access-ld9mz\") pod \"auto-csr-approver-29564168-46r9b\" (UID: \"f401c295-b79a-4193-a1a7-f40d9b42a96a\") " pod="openshift-infra/auto-csr-approver-29564168-46r9b" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.472477 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-46r9b" Mar 18 16:08:00 crc kubenswrapper[4696]: I0318 16:08:00.897827 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-46r9b"] Mar 18 16:08:01 crc kubenswrapper[4696]: I0318 16:08:01.555202 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-46r9b" event={"ID":"f401c295-b79a-4193-a1a7-f40d9b42a96a","Type":"ContainerStarted","Data":"374a44ca36575c25f06f9b5cf88acc5ef7898dd4d09e786ca235985681c6a4b5"} Mar 18 16:08:02 crc kubenswrapper[4696]: I0318 16:08:02.570236 4696 generic.go:334] "Generic (PLEG): container finished" podID="f401c295-b79a-4193-a1a7-f40d9b42a96a" containerID="686791b4014df2b99fd1a3edc3c6f046c87e536e7aa1d1f1d3f5e057a1771481" exitCode=0 Mar 18 16:08:02 crc kubenswrapper[4696]: I0318 16:08:02.570322 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-46r9b" event={"ID":"f401c295-b79a-4193-a1a7-f40d9b42a96a","Type":"ContainerDied","Data":"686791b4014df2b99fd1a3edc3c6f046c87e536e7aa1d1f1d3f5e057a1771481"} Mar 18 16:08:02 crc kubenswrapper[4696]: I0318 16:08:02.597541 4696 scope.go:117] "RemoveContainer" 
containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:08:02 crc kubenswrapper[4696]: E0318 16:08:02.598003 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:08:03 crc kubenswrapper[4696]: I0318 16:08:03.948482 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-46r9b" Mar 18 16:08:04 crc kubenswrapper[4696]: I0318 16:08:04.075484 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld9mz\" (UniqueName: \"kubernetes.io/projected/f401c295-b79a-4193-a1a7-f40d9b42a96a-kube-api-access-ld9mz\") pod \"f401c295-b79a-4193-a1a7-f40d9b42a96a\" (UID: \"f401c295-b79a-4193-a1a7-f40d9b42a96a\") " Mar 18 16:08:04 crc kubenswrapper[4696]: I0318 16:08:04.082843 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f401c295-b79a-4193-a1a7-f40d9b42a96a-kube-api-access-ld9mz" (OuterVolumeSpecName: "kube-api-access-ld9mz") pod "f401c295-b79a-4193-a1a7-f40d9b42a96a" (UID: "f401c295-b79a-4193-a1a7-f40d9b42a96a"). InnerVolumeSpecName "kube-api-access-ld9mz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:08:04 crc kubenswrapper[4696]: I0318 16:08:04.178385 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld9mz\" (UniqueName: \"kubernetes.io/projected/f401c295-b79a-4193-a1a7-f40d9b42a96a-kube-api-access-ld9mz\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:04 crc kubenswrapper[4696]: I0318 16:08:04.593548 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-46r9b" event={"ID":"f401c295-b79a-4193-a1a7-f40d9b42a96a","Type":"ContainerDied","Data":"374a44ca36575c25f06f9b5cf88acc5ef7898dd4d09e786ca235985681c6a4b5"} Mar 18 16:08:04 crc kubenswrapper[4696]: I0318 16:08:04.593588 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="374a44ca36575c25f06f9b5cf88acc5ef7898dd4d09e786ca235985681c6a4b5" Mar 18 16:08:04 crc kubenswrapper[4696]: I0318 16:08:04.593615 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-46r9b" Mar 18 16:08:05 crc kubenswrapper[4696]: I0318 16:08:05.008838 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-mdws6"] Mar 18 16:08:05 crc kubenswrapper[4696]: I0318 16:08:05.016603 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-mdws6"] Mar 18 16:08:05 crc kubenswrapper[4696]: I0318 16:08:05.609614 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3a1920-bf4b-4b2c-a0df-889337ff9f2e" path="/var/lib/kubelet/pods/5c3a1920-bf4b-4b2c-a0df-889337ff9f2e/volumes" Mar 18 16:08:16 crc kubenswrapper[4696]: I0318 16:08:16.597176 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9" Mar 18 16:08:17 crc kubenswrapper[4696]: I0318 16:08:17.723812 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"418d22c717d701d8c2458f1bdfe29acc3f5066a0a4b0d71d5979be21ee6f29d7"} Mar 18 16:08:21 crc kubenswrapper[4696]: I0318 16:08:21.600739 4696 scope.go:117] "RemoveContainer" containerID="58cc05b26e706e9f4326b89ac51b0d5c084fb76d960d336bc6aa34f3b91da54b" Mar 18 16:08:21 crc kubenswrapper[4696]: I0318 16:08:21.648452 4696 scope.go:117] "RemoveContainer" containerID="c67eb1bc03daa547bc5db788dc359da158c9646c80c82286bcd12bd859ba670f" Mar 18 16:08:21 crc kubenswrapper[4696]: I0318 16:08:21.692963 4696 scope.go:117] "RemoveContainer" containerID="a2126385fe875ffff376a0d9a679939b4e646644b2caacd676f6cbcd88de07f6" Mar 18 16:08:21 crc kubenswrapper[4696]: I0318 16:08:21.742497 4696 scope.go:117] "RemoveContainer" containerID="c1de6795c4da3f0414c9c4ed7fa8707ce5a74ed14795eb5ac72927b8f0cea52d" Mar 18 16:08:21 crc kubenswrapper[4696]: I0318 16:08:21.807606 4696 scope.go:117] "RemoveContainer" containerID="b33330338b64f9ea3a53789b59a27ef3b0fc2d23861c97d78add74615715ec20" Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.054869 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4qnw5"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.070210 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xpnvg"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.079845 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7121-account-create-update-45gq6"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.088316 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-726d-account-create-update-c8c2t"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.095361 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4qnw5"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 
16:08:24.102338 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-726d-account-create-update-c8c2t"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.109884 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-t9mvs"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.117317 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7121-account-create-update-45gq6"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.125549 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xpnvg"] Mar 18 16:08:24 crc kubenswrapper[4696]: I0318 16:08:24.133513 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-t9mvs"] Mar 18 16:08:25 crc kubenswrapper[4696]: I0318 16:08:25.027089 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e7be-account-create-update-v2pr7"] Mar 18 16:08:25 crc kubenswrapper[4696]: I0318 16:08:25.034373 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e7be-account-create-update-v2pr7"] Mar 18 16:08:25 crc kubenswrapper[4696]: I0318 16:08:25.612174 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159d7aa7-449a-4e20-940c-f8c6f6fc88f0" path="/var/lib/kubelet/pods/159d7aa7-449a-4e20-940c-f8c6f6fc88f0/volumes" Mar 18 16:08:25 crc kubenswrapper[4696]: I0318 16:08:25.613809 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="251f8d1d-fe68-479f-9af1-ec8c288b3524" path="/var/lib/kubelet/pods/251f8d1d-fe68-479f-9af1-ec8c288b3524/volumes" Mar 18 16:08:25 crc kubenswrapper[4696]: I0318 16:08:25.615287 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b054f24-437a-4095-b1c4-c7e20b1e68d5" path="/var/lib/kubelet/pods/2b054f24-437a-4095-b1c4-c7e20b1e68d5/volumes" Mar 18 16:08:25 crc kubenswrapper[4696]: I0318 16:08:25.617319 4696 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a539865a-1121-432b-909b-617f805460d9" path="/var/lib/kubelet/pods/a539865a-1121-432b-909b-617f805460d9/volumes" Mar 18 16:08:25 crc kubenswrapper[4696]: I0318 16:08:25.619124 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ce5a9d-83ad-459d-b149-25d0760e14e2" path="/var/lib/kubelet/pods/b1ce5a9d-83ad-459d-b149-25d0760e14e2/volumes" Mar 18 16:08:25 crc kubenswrapper[4696]: I0318 16:08:25.620020 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3" path="/var/lib/kubelet/pods/f2a154bb-e9c5-44ce-bf27-7bdf87c86ea3/volumes" Mar 18 16:08:39 crc kubenswrapper[4696]: I0318 16:08:39.941858 4696 generic.go:334] "Generic (PLEG): container finished" podID="f2c235ca-a193-47df-8495-600e7c8eea37" containerID="ad103f1b8ecc9096f697b6819e57391e0f730a1cfd9871563c717dcbc7273312" exitCode=0 Mar 18 16:08:39 crc kubenswrapper[4696]: I0318 16:08:39.941944 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" event={"ID":"f2c235ca-a193-47df-8495-600e7c8eea37","Type":"ContainerDied","Data":"ad103f1b8ecc9096f697b6819e57391e0f730a1cfd9871563c717dcbc7273312"} Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.393535 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.416708 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-inventory\") pod \"f2c235ca-a193-47df-8495-600e7c8eea37\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.416816 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p95hg\" (UniqueName: \"kubernetes.io/projected/f2c235ca-a193-47df-8495-600e7c8eea37-kube-api-access-p95hg\") pod \"f2c235ca-a193-47df-8495-600e7c8eea37\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.416955 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-ssh-key-openstack-edpm-ipam\") pod \"f2c235ca-a193-47df-8495-600e7c8eea37\" (UID: \"f2c235ca-a193-47df-8495-600e7c8eea37\") " Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.423926 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c235ca-a193-47df-8495-600e7c8eea37-kube-api-access-p95hg" (OuterVolumeSpecName: "kube-api-access-p95hg") pod "f2c235ca-a193-47df-8495-600e7c8eea37" (UID: "f2c235ca-a193-47df-8495-600e7c8eea37"). InnerVolumeSpecName "kube-api-access-p95hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.445729 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-inventory" (OuterVolumeSpecName: "inventory") pod "f2c235ca-a193-47df-8495-600e7c8eea37" (UID: "f2c235ca-a193-47df-8495-600e7c8eea37"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.445816 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f2c235ca-a193-47df-8495-600e7c8eea37" (UID: "f2c235ca-a193-47df-8495-600e7c8eea37"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.518642 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p95hg\" (UniqueName: \"kubernetes.io/projected/f2c235ca-a193-47df-8495-600e7c8eea37-kube-api-access-p95hg\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.518672 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.518685 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c235ca-a193-47df-8495-600e7c8eea37-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.963583 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.963630 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb" event={"ID":"f2c235ca-a193-47df-8495-600e7c8eea37","Type":"ContainerDied","Data":"5ed432ecc12a852b0082a6132b99a654b060194572aadcb5eead0f2d084fcf8d"} Mar 18 16:08:41 crc kubenswrapper[4696]: I0318 16:08:41.963671 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ed432ecc12a852b0082a6132b99a654b060194572aadcb5eead0f2d084fcf8d" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.052463 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz"] Mar 18 16:08:42 crc kubenswrapper[4696]: E0318 16:08:42.052931 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f401c295-b79a-4193-a1a7-f40d9b42a96a" containerName="oc" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.052954 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f401c295-b79a-4193-a1a7-f40d9b42a96a" containerName="oc" Mar 18 16:08:42 crc kubenswrapper[4696]: E0318 16:08:42.052968 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c235ca-a193-47df-8495-600e7c8eea37" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.052976 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c235ca-a193-47df-8495-600e7c8eea37" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.053512 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c235ca-a193-47df-8495-600e7c8eea37" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 
16:08:42.053558 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f401c295-b79a-4193-a1a7-f40d9b42a96a" containerName="oc" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.054349 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.058294 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.058868 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.058909 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.059379 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.063539 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz"] Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.134075 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.134146 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtrs6\" (UniqueName: 
\"kubernetes.io/projected/ffcf2496-8e16-4355-863a-7cad2e2357fe-kube-api-access-dtrs6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.134677 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.236486 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.236598 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.236627 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtrs6\" (UniqueName: \"kubernetes.io/projected/ffcf2496-8e16-4355-863a-7cad2e2357fe-kube-api-access-dtrs6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: 
\"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.242786 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.253102 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.254649 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtrs6\" (UniqueName: \"kubernetes.io/projected/ffcf2496-8e16-4355-863a-7cad2e2357fe-kube-api-access-dtrs6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zxchz\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.390330 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:42 crc kubenswrapper[4696]: I0318 16:08:42.972352 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz"] Mar 18 16:08:43 crc kubenswrapper[4696]: I0318 16:08:43.980099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" event={"ID":"ffcf2496-8e16-4355-863a-7cad2e2357fe","Type":"ContainerStarted","Data":"43c1dacace486ea59f852032850ff84468c68bb08fb94f7c44a3bf1de2dda555"} Mar 18 16:08:43 crc kubenswrapper[4696]: I0318 16:08:43.980680 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" event={"ID":"ffcf2496-8e16-4355-863a-7cad2e2357fe","Type":"ContainerStarted","Data":"97a55d83424e375f9b35993d2e152c37b6adf83be84ec0d8ffae6e0ef46acd68"} Mar 18 16:08:43 crc kubenswrapper[4696]: I0318 16:08:43.999035 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" podStartSLOduration=1.820964993 podStartE2EDuration="1.9990135s" podCreationTimestamp="2026-03-18 16:08:42 +0000 UTC" firstStartedPulling="2026-03-18 16:08:42.980794064 +0000 UTC m=+1965.986968280" lastFinishedPulling="2026-03-18 16:08:43.158842581 +0000 UTC m=+1966.165016787" observedRunningTime="2026-03-18 16:08:43.992885986 +0000 UTC m=+1966.999060202" watchObservedRunningTime="2026-03-18 16:08:43.9990135 +0000 UTC m=+1967.005187706" Mar 18 16:08:48 crc kubenswrapper[4696]: I0318 16:08:48.010922 4696 generic.go:334] "Generic (PLEG): container finished" podID="ffcf2496-8e16-4355-863a-7cad2e2357fe" containerID="43c1dacace486ea59f852032850ff84468c68bb08fb94f7c44a3bf1de2dda555" exitCode=0 Mar 18 16:08:48 crc kubenswrapper[4696]: I0318 16:08:48.011015 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" event={"ID":"ffcf2496-8e16-4355-863a-7cad2e2357fe","Type":"ContainerDied","Data":"43c1dacace486ea59f852032850ff84468c68bb08fb94f7c44a3bf1de2dda555"} Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.424275 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.572850 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-inventory\") pod \"ffcf2496-8e16-4355-863a-7cad2e2357fe\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.572911 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtrs6\" (UniqueName: \"kubernetes.io/projected/ffcf2496-8e16-4355-863a-7cad2e2357fe-kube-api-access-dtrs6\") pod \"ffcf2496-8e16-4355-863a-7cad2e2357fe\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.572998 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-ssh-key-openstack-edpm-ipam\") pod \"ffcf2496-8e16-4355-863a-7cad2e2357fe\" (UID: \"ffcf2496-8e16-4355-863a-7cad2e2357fe\") " Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.579014 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffcf2496-8e16-4355-863a-7cad2e2357fe-kube-api-access-dtrs6" (OuterVolumeSpecName: "kube-api-access-dtrs6") pod "ffcf2496-8e16-4355-863a-7cad2e2357fe" (UID: "ffcf2496-8e16-4355-863a-7cad2e2357fe"). InnerVolumeSpecName "kube-api-access-dtrs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.600299 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-inventory" (OuterVolumeSpecName: "inventory") pod "ffcf2496-8e16-4355-863a-7cad2e2357fe" (UID: "ffcf2496-8e16-4355-863a-7cad2e2357fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.609910 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ffcf2496-8e16-4355-863a-7cad2e2357fe" (UID: "ffcf2496-8e16-4355-863a-7cad2e2357fe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.675579 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.675625 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcf2496-8e16-4355-863a-7cad2e2357fe-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:49 crc kubenswrapper[4696]: I0318 16:08:49.675637 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtrs6\" (UniqueName: \"kubernetes.io/projected/ffcf2496-8e16-4355-863a-7cad2e2357fe-kube-api-access-dtrs6\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.029968 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" 
event={"ID":"ffcf2496-8e16-4355-863a-7cad2e2357fe","Type":"ContainerDied","Data":"97a55d83424e375f9b35993d2e152c37b6adf83be84ec0d8ffae6e0ef46acd68"} Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.030015 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97a55d83424e375f9b35993d2e152c37b6adf83be84ec0d8ffae6e0ef46acd68" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.030078 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zxchz" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.107352 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd"] Mar 18 16:08:50 crc kubenswrapper[4696]: E0318 16:08:50.107911 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcf2496-8e16-4355-863a-7cad2e2357fe" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.107935 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcf2496-8e16-4355-863a-7cad2e2357fe" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.108146 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffcf2496-8e16-4355-863a-7cad2e2357fe" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.108951 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.110815 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.110929 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.111730 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.112220 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.118011 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd"] Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.286664 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.286738 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9clb\" (UniqueName: \"kubernetes.io/projected/aa8fa732-917d-4782-aa47-b1846179b603-kube-api-access-x9clb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.288596 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.391011 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.391073 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9clb\" (UniqueName: \"kubernetes.io/projected/aa8fa732-917d-4782-aa47-b1846179b603-kube-api-access-x9clb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.391157 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.396036 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.402507 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.410305 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9clb\" (UniqueName: \"kubernetes.io/projected/aa8fa732-917d-4782-aa47-b1846179b603-kube-api-access-x9clb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-8qtfd\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.439192 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:08:50 crc kubenswrapper[4696]: I0318 16:08:50.922724 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd"] Mar 18 16:08:51 crc kubenswrapper[4696]: I0318 16:08:51.040176 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" event={"ID":"aa8fa732-917d-4782-aa47-b1846179b603","Type":"ContainerStarted","Data":"8b1b0a5d08f0d86b781ecbb0d60d837479fb4e0666984b31480d9258e1fc3d2e"} Mar 18 16:08:52 crc kubenswrapper[4696]: I0318 16:08:52.052002 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" event={"ID":"aa8fa732-917d-4782-aa47-b1846179b603","Type":"ContainerStarted","Data":"632e756fe98f0d29de6246f17da6d10439f47847865e3ece75deb5a10bec129f"} Mar 18 16:08:52 crc kubenswrapper[4696]: I0318 16:08:52.076762 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" podStartSLOduration=1.904981044 podStartE2EDuration="2.076738253s" podCreationTimestamp="2026-03-18 16:08:50 +0000 UTC" firstStartedPulling="2026-03-18 16:08:50.924497925 +0000 UTC m=+1973.930672131" lastFinishedPulling="2026-03-18 16:08:51.096255114 +0000 UTC m=+1974.102429340" observedRunningTime="2026-03-18 16:08:52.067041479 +0000 UTC m=+1975.073215705" watchObservedRunningTime="2026-03-18 16:08:52.076738253 +0000 UTC m=+1975.082912479" Mar 18 16:08:54 crc kubenswrapper[4696]: I0318 16:08:54.045348 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b9zfd"] Mar 18 16:08:54 crc kubenswrapper[4696]: I0318 16:08:54.052926 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b9zfd"] Mar 18 16:08:55 crc kubenswrapper[4696]: I0318 
16:08:55.608830 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c1777e-d983-4003-a570-ce8c867cb635" path="/var/lib/kubelet/pods/02c1777e-d983-4003-a570-ce8c867cb635/volumes" Mar 18 16:09:21 crc kubenswrapper[4696]: I0318 16:09:21.917894 4696 scope.go:117] "RemoveContainer" containerID="b3e300c1debfef4904b1384978e21660987ce8668b8d5bc6645377fc4fd68e6d" Mar 18 16:09:21 crc kubenswrapper[4696]: I0318 16:09:21.950566 4696 scope.go:117] "RemoveContainer" containerID="8caa3000254ac99a134d89c5d4235ddc93922c7ed7b9e1bdc27fe7ff7fa9305d" Mar 18 16:09:22 crc kubenswrapper[4696]: I0318 16:09:22.035995 4696 scope.go:117] "RemoveContainer" containerID="c376fef3ac9ee5e7617d35469707ccc25afd0355f3ebca22d225c423b65f6df4" Mar 18 16:09:22 crc kubenswrapper[4696]: I0318 16:09:22.094431 4696 scope.go:117] "RemoveContainer" containerID="2ba0a95e059696a948f85ffaffff7aa7263f9f6928a81c9152a6782f7a8cd566" Mar 18 16:09:22 crc kubenswrapper[4696]: I0318 16:09:22.154462 4696 scope.go:117] "RemoveContainer" containerID="f0ea14d062097505ffdbb91a2c3e7f04946e03f52b870531d585688773f16b6a" Mar 18 16:09:22 crc kubenswrapper[4696]: I0318 16:09:22.196798 4696 scope.go:117] "RemoveContainer" containerID="30b707128ffd2831d6f66974959577976d5c642f27a6dd268b6f74821dea499d" Mar 18 16:09:22 crc kubenswrapper[4696]: I0318 16:09:22.218559 4696 scope.go:117] "RemoveContainer" containerID="e1cad57bf6e2565598e8f5d7be4ab29511bb53b2d773a1a23a0a2ea71c43c95b" Mar 18 16:09:22 crc kubenswrapper[4696]: I0318 16:09:22.242124 4696 scope.go:117] "RemoveContainer" containerID="c581e429fcd7c983f88db67db759762819b5e9dd6af085b22d1a439d1d6e1076" Mar 18 16:09:22 crc kubenswrapper[4696]: I0318 16:09:22.265767 4696 scope.go:117] "RemoveContainer" containerID="e732a1272e46b31776481c44744f8c475afbcd0066eb328309d46fb2acd03078" Mar 18 16:09:22 crc kubenswrapper[4696]: I0318 16:09:22.296564 4696 scope.go:117] "RemoveContainer" containerID="93559a5afc6b0d765466ad8d35a5ad9e433da98933e1fd3b809da89c53f877f7" Mar 18 
16:09:27 crc kubenswrapper[4696]: I0318 16:09:27.420425 4696 generic.go:334] "Generic (PLEG): container finished" podID="aa8fa732-917d-4782-aa47-b1846179b603" containerID="632e756fe98f0d29de6246f17da6d10439f47847865e3ece75deb5a10bec129f" exitCode=0 Mar 18 16:09:27 crc kubenswrapper[4696]: I0318 16:09:27.420529 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" event={"ID":"aa8fa732-917d-4782-aa47-b1846179b603","Type":"ContainerDied","Data":"632e756fe98f0d29de6246f17da6d10439f47847865e3ece75deb5a10bec129f"} Mar 18 16:09:28 crc kubenswrapper[4696]: I0318 16:09:28.967258 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.043081 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-ssh-key-openstack-edpm-ipam\") pod \"aa8fa732-917d-4782-aa47-b1846179b603\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.043157 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-inventory\") pod \"aa8fa732-917d-4782-aa47-b1846179b603\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.043244 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9clb\" (UniqueName: \"kubernetes.io/projected/aa8fa732-917d-4782-aa47-b1846179b603-kube-api-access-x9clb\") pod \"aa8fa732-917d-4782-aa47-b1846179b603\" (UID: \"aa8fa732-917d-4782-aa47-b1846179b603\") " Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.049349 4696 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8fa732-917d-4782-aa47-b1846179b603-kube-api-access-x9clb" (OuterVolumeSpecName: "kube-api-access-x9clb") pod "aa8fa732-917d-4782-aa47-b1846179b603" (UID: "aa8fa732-917d-4782-aa47-b1846179b603"). InnerVolumeSpecName "kube-api-access-x9clb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.069038 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aa8fa732-917d-4782-aa47-b1846179b603" (UID: "aa8fa732-917d-4782-aa47-b1846179b603"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.072904 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-inventory" (OuterVolumeSpecName: "inventory") pod "aa8fa732-917d-4782-aa47-b1846179b603" (UID: "aa8fa732-917d-4782-aa47-b1846179b603"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.145733 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.145772 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8fa732-917d-4782-aa47-b1846179b603-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.145784 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9clb\" (UniqueName: \"kubernetes.io/projected/aa8fa732-917d-4782-aa47-b1846179b603-kube-api-access-x9clb\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.298005 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2kxsm"] Mar 18 16:09:29 crc kubenswrapper[4696]: E0318 16:09:29.298602 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8fa732-917d-4782-aa47-b1846179b603" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.298641 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8fa732-917d-4782-aa47-b1846179b603" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.298913 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8fa732-917d-4782-aa47-b1846179b603" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.300899 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.306949 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kxsm"] Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.458012 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-utilities\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.458158 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwx4x\" (UniqueName: \"kubernetes.io/projected/deac328f-6ccb-4c04-a220-e7e77442e4d8-kube-api-access-pwx4x\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.459123 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-catalog-content\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.459223 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" event={"ID":"aa8fa732-917d-4782-aa47-b1846179b603","Type":"ContainerDied","Data":"8b1b0a5d08f0d86b781ecbb0d60d837479fb4e0666984b31480d9258e1fc3d2e"} Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.459280 4696 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8b1b0a5d08f0d86b781ecbb0d60d837479fb4e0666984b31480d9258e1fc3d2e" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.459366 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-8qtfd" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.529638 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7"] Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.530927 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.534680 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.534764 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.534940 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.535101 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.544742 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7"] Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.560769 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-utilities\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 
crc kubenswrapper[4696]: I0318 16:09:29.560824 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwx4x\" (UniqueName: \"kubernetes.io/projected/deac328f-6ccb-4c04-a220-e7e77442e4d8-kube-api-access-pwx4x\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.560884 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-catalog-content\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.561233 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-utilities\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.561260 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-catalog-content\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.580574 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwx4x\" (UniqueName: \"kubernetes.io/projected/deac328f-6ccb-4c04-a220-e7e77442e4d8-kube-api-access-pwx4x\") pod \"redhat-marketplace-2kxsm\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 
16:09:29.624475 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.663060 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqfwj\" (UniqueName: \"kubernetes.io/projected/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-kube-api-access-nqfwj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.663174 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.663216 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.765021 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqfwj\" (UniqueName: \"kubernetes.io/projected/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-kube-api-access-nqfwj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.765128 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.765160 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.770764 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.771231 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.789215 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nqfwj\" (UniqueName: \"kubernetes.io/projected/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-kube-api-access-nqfwj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:29 crc kubenswrapper[4696]: I0318 16:09:29.850812 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:09:30 crc kubenswrapper[4696]: I0318 16:09:30.078455 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kxsm"] Mar 18 16:09:30 crc kubenswrapper[4696]: I0318 16:09:30.375468 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7"] Mar 18 16:09:30 crc kubenswrapper[4696]: W0318 16:09:30.384193 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ffbd0db_84b3_4593_a9f8_7f61bf72fdc6.slice/crio-119a81aec1156dcaccef277e5391c1dcd067ecd39ded4a4473d8ed230e107b51 WatchSource:0}: Error finding container 119a81aec1156dcaccef277e5391c1dcd067ecd39ded4a4473d8ed230e107b51: Status 404 returned error can't find the container with id 119a81aec1156dcaccef277e5391c1dcd067ecd39ded4a4473d8ed230e107b51 Mar 18 16:09:30 crc kubenswrapper[4696]: I0318 16:09:30.469455 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" event={"ID":"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6","Type":"ContainerStarted","Data":"119a81aec1156dcaccef277e5391c1dcd067ecd39ded4a4473d8ed230e107b51"} Mar 18 16:09:30 crc kubenswrapper[4696]: I0318 16:09:30.471580 4696 generic.go:334] "Generic (PLEG): container finished" podID="deac328f-6ccb-4c04-a220-e7e77442e4d8" 
containerID="9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2" exitCode=0 Mar 18 16:09:30 crc kubenswrapper[4696]: I0318 16:09:30.471612 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kxsm" event={"ID":"deac328f-6ccb-4c04-a220-e7e77442e4d8","Type":"ContainerDied","Data":"9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2"} Mar 18 16:09:30 crc kubenswrapper[4696]: I0318 16:09:30.471630 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kxsm" event={"ID":"deac328f-6ccb-4c04-a220-e7e77442e4d8","Type":"ContainerStarted","Data":"6bfd9b7d323c032bde20b3adbdc78440d27a8d07f5432770b7415cd04c9542c1"} Mar 18 16:09:31 crc kubenswrapper[4696]: I0318 16:09:31.481972 4696 generic.go:334] "Generic (PLEG): container finished" podID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerID="eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3" exitCode=0 Mar 18 16:09:31 crc kubenswrapper[4696]: I0318 16:09:31.482090 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kxsm" event={"ID":"deac328f-6ccb-4c04-a220-e7e77442e4d8","Type":"ContainerDied","Data":"eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3"} Mar 18 16:09:31 crc kubenswrapper[4696]: I0318 16:09:31.485372 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" event={"ID":"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6","Type":"ContainerStarted","Data":"4c0196b1981d64eec622b8597556c6ffce77f70b7dfec3ea00cde87612e93efb"} Mar 18 16:09:31 crc kubenswrapper[4696]: I0318 16:09:31.539390 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" podStartSLOduration=2.351560686 podStartE2EDuration="2.539372008s" podCreationTimestamp="2026-03-18 16:09:29 +0000 UTC" 
firstStartedPulling="2026-03-18 16:09:30.388449813 +0000 UTC m=+2013.394624019" lastFinishedPulling="2026-03-18 16:09:30.576261135 +0000 UTC m=+2013.582435341" observedRunningTime="2026-03-18 16:09:31.532959707 +0000 UTC m=+2014.539133923" watchObservedRunningTime="2026-03-18 16:09:31.539372008 +0000 UTC m=+2014.545546214" Mar 18 16:09:32 crc kubenswrapper[4696]: I0318 16:09:32.498304 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kxsm" event={"ID":"deac328f-6ccb-4c04-a220-e7e77442e4d8","Type":"ContainerStarted","Data":"534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84"} Mar 18 16:09:32 crc kubenswrapper[4696]: I0318 16:09:32.516933 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2kxsm" podStartSLOduration=2.071234263 podStartE2EDuration="3.516916113s" podCreationTimestamp="2026-03-18 16:09:29 +0000 UTC" firstStartedPulling="2026-03-18 16:09:30.473047105 +0000 UTC m=+2013.479221301" lastFinishedPulling="2026-03-18 16:09:31.918728945 +0000 UTC m=+2014.924903151" observedRunningTime="2026-03-18 16:09:32.515645031 +0000 UTC m=+2015.521819257" watchObservedRunningTime="2026-03-18 16:09:32.516916113 +0000 UTC m=+2015.523090319" Mar 18 16:09:39 crc kubenswrapper[4696]: I0318 16:09:39.625375 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:39 crc kubenswrapper[4696]: I0318 16:09:39.625982 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:39 crc kubenswrapper[4696]: I0318 16:09:39.673611 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:40 crc kubenswrapper[4696]: I0318 16:09:40.644154 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:40 crc kubenswrapper[4696]: I0318 16:09:40.698989 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kxsm"] Mar 18 16:09:42 crc kubenswrapper[4696]: I0318 16:09:42.605400 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2kxsm" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerName="registry-server" containerID="cri-o://534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84" gracePeriod=2 Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.537413 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.612873 4696 generic.go:334] "Generic (PLEG): container finished" podID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerID="534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84" exitCode=0 Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.613917 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kxsm" event={"ID":"deac328f-6ccb-4c04-a220-e7e77442e4d8","Type":"ContainerDied","Data":"534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84"} Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.614053 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kxsm" event={"ID":"deac328f-6ccb-4c04-a220-e7e77442e4d8","Type":"ContainerDied","Data":"6bfd9b7d323c032bde20b3adbdc78440d27a8d07f5432770b7415cd04c9542c1"} Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.613960 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kxsm" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.614138 4696 scope.go:117] "RemoveContainer" containerID="534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.637660 4696 scope.go:117] "RemoveContainer" containerID="eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.658167 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-utilities\") pod \"deac328f-6ccb-4c04-a220-e7e77442e4d8\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.658255 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwx4x\" (UniqueName: \"kubernetes.io/projected/deac328f-6ccb-4c04-a220-e7e77442e4d8-kube-api-access-pwx4x\") pod \"deac328f-6ccb-4c04-a220-e7e77442e4d8\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.658481 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-catalog-content\") pod \"deac328f-6ccb-4c04-a220-e7e77442e4d8\" (UID: \"deac328f-6ccb-4c04-a220-e7e77442e4d8\") " Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.659593 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-utilities" (OuterVolumeSpecName: "utilities") pod "deac328f-6ccb-4c04-a220-e7e77442e4d8" (UID: "deac328f-6ccb-4c04-a220-e7e77442e4d8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.660630 4696 scope.go:117] "RemoveContainer" containerID="9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.664779 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deac328f-6ccb-4c04-a220-e7e77442e4d8-kube-api-access-pwx4x" (OuterVolumeSpecName: "kube-api-access-pwx4x") pod "deac328f-6ccb-4c04-a220-e7e77442e4d8" (UID: "deac328f-6ccb-4c04-a220-e7e77442e4d8"). InnerVolumeSpecName "kube-api-access-pwx4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.692817 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deac328f-6ccb-4c04-a220-e7e77442e4d8" (UID: "deac328f-6ccb-4c04-a220-e7e77442e4d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.744194 4696 scope.go:117] "RemoveContainer" containerID="534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84" Mar 18 16:09:43 crc kubenswrapper[4696]: E0318 16:09:43.744687 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84\": container with ID starting with 534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84 not found: ID does not exist" containerID="534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.744753 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84"} err="failed to get container status \"534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84\": rpc error: code = NotFound desc = could not find container \"534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84\": container with ID starting with 534b9322497df9e07ea8ff09922c2b66843cff017d07e31b336b9f5533d97f84 not found: ID does not exist" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.744781 4696 scope.go:117] "RemoveContainer" containerID="eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3" Mar 18 16:09:43 crc kubenswrapper[4696]: E0318 16:09:43.745217 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3\": container with ID starting with eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3 not found: ID does not exist" containerID="eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.745257 
4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3"} err="failed to get container status \"eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3\": rpc error: code = NotFound desc = could not find container \"eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3\": container with ID starting with eddab8867e710721156fc48c70e601e7b4de57ab343bcd538e6969470a8897d3 not found: ID does not exist" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.745283 4696 scope.go:117] "RemoveContainer" containerID="9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2" Mar 18 16:09:43 crc kubenswrapper[4696]: E0318 16:09:43.745601 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2\": container with ID starting with 9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2 not found: ID does not exist" containerID="9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.745653 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2"} err="failed to get container status \"9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2\": rpc error: code = NotFound desc = could not find container \"9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2\": container with ID starting with 9fca7ef64d21c7592494de98062e84952a9f7fa9f7928eab80bfe72a514e75a2 not found: ID does not exist" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.765891 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.765928 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deac328f-6ccb-4c04-a220-e7e77442e4d8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.765938 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwx4x\" (UniqueName: \"kubernetes.io/projected/deac328f-6ccb-4c04-a220-e7e77442e4d8-kube-api-access-pwx4x\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.947674 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kxsm"] Mar 18 16:09:43 crc kubenswrapper[4696]: I0318 16:09:43.955821 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kxsm"] Mar 18 16:09:45 crc kubenswrapper[4696]: I0318 16:09:45.615622 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" path="/var/lib/kubelet/pods/deac328f-6ccb-4c04-a220-e7e77442e4d8/volumes" Mar 18 16:09:56 crc kubenswrapper[4696]: I0318 16:09:56.054716 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-m56xp"] Mar 18 16:09:56 crc kubenswrapper[4696]: I0318 16:09:56.063916 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-m56xp"] Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.444472 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gskmp"] Mar 18 16:09:57 crc kubenswrapper[4696]: E0318 16:09:57.445421 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerName="registry-server" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 
16:09:57.445439 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerName="registry-server" Mar 18 16:09:57 crc kubenswrapper[4696]: E0318 16:09:57.445473 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerName="extract-utilities" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.445483 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerName="extract-utilities" Mar 18 16:09:57 crc kubenswrapper[4696]: E0318 16:09:57.445559 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerName="extract-content" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.445570 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerName="extract-content" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.445924 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="deac328f-6ccb-4c04-a220-e7e77442e4d8" containerName="registry-server" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.452297 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.463374 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gskmp"] Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.581625 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqvq\" (UniqueName: \"kubernetes.io/projected/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-kube-api-access-5wqvq\") pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.581787 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-utilities\") pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.581846 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-catalog-content\") pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.611300 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e3998f-8e86-422d-bfdc-093018e4e311" path="/var/lib/kubelet/pods/29e3998f-8e86-422d-bfdc-093018e4e311/volumes" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.683513 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqvq\" (UniqueName: \"kubernetes.io/projected/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-kube-api-access-5wqvq\") 
pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.683697 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-utilities\") pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.683750 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-catalog-content\") pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.684337 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-catalog-content\") pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.684656 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-utilities\") pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.705469 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqvq\" (UniqueName: \"kubernetes.io/projected/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-kube-api-access-5wqvq\") pod \"redhat-operators-gskmp\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") " 
pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:57 crc kubenswrapper[4696]: I0318 16:09:57.795880 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:09:58 crc kubenswrapper[4696]: I0318 16:09:58.307444 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gskmp"] Mar 18 16:09:58 crc kubenswrapper[4696]: I0318 16:09:58.768634 4696 generic.go:334] "Generic (PLEG): container finished" podID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerID="2b4b4cc07141f78984957efc64a65681497ec3b1bf6bd74b1c2b70579efd3cc6" exitCode=0 Mar 18 16:09:58 crc kubenswrapper[4696]: I0318 16:09:58.771082 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gskmp" event={"ID":"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d","Type":"ContainerDied","Data":"2b4b4cc07141f78984957efc64a65681497ec3b1bf6bd74b1c2b70579efd3cc6"} Mar 18 16:09:58 crc kubenswrapper[4696]: I0318 16:09:58.771266 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gskmp" event={"ID":"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d","Type":"ContainerStarted","Data":"29b2810a9988ef00e3b6968f8d183de69fa0759e23b8af3fa660c817b9f1ebc4"} Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.147151 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564170-wgdnn"] Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.148932 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-wgdnn" Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.151072 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.151907 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.152041 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.160867 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-wgdnn"] Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.249392 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvqp8\" (UniqueName: \"kubernetes.io/projected/235ffb53-2fcd-4d61-8f87-f5a687615356-kube-api-access-dvqp8\") pod \"auto-csr-approver-29564170-wgdnn\" (UID: \"235ffb53-2fcd-4d61-8f87-f5a687615356\") " pod="openshift-infra/auto-csr-approver-29564170-wgdnn" Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.351116 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvqp8\" (UniqueName: \"kubernetes.io/projected/235ffb53-2fcd-4d61-8f87-f5a687615356-kube-api-access-dvqp8\") pod \"auto-csr-approver-29564170-wgdnn\" (UID: \"235ffb53-2fcd-4d61-8f87-f5a687615356\") " pod="openshift-infra/auto-csr-approver-29564170-wgdnn" Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.371396 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvqp8\" (UniqueName: \"kubernetes.io/projected/235ffb53-2fcd-4d61-8f87-f5a687615356-kube-api-access-dvqp8\") pod \"auto-csr-approver-29564170-wgdnn\" (UID: \"235ffb53-2fcd-4d61-8f87-f5a687615356\") " 
pod="openshift-infra/auto-csr-approver-29564170-wgdnn" Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.479235 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-wgdnn" Mar 18 16:10:00 crc kubenswrapper[4696]: W0318 16:10:00.947077 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod235ffb53_2fcd_4d61_8f87_f5a687615356.slice/crio-0ddb4ae76ffe14f074996e8f32a76a75fd939d4617b695ecb27d44f803c1d649 WatchSource:0}: Error finding container 0ddb4ae76ffe14f074996e8f32a76a75fd939d4617b695ecb27d44f803c1d649: Status 404 returned error can't find the container with id 0ddb4ae76ffe14f074996e8f32a76a75fd939d4617b695ecb27d44f803c1d649 Mar 18 16:10:00 crc kubenswrapper[4696]: I0318 16:10:00.947427 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-wgdnn"] Mar 18 16:10:01 crc kubenswrapper[4696]: I0318 16:10:01.802746 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-wgdnn" event={"ID":"235ffb53-2fcd-4d61-8f87-f5a687615356","Type":"ContainerStarted","Data":"0ddb4ae76ffe14f074996e8f32a76a75fd939d4617b695ecb27d44f803c1d649"} Mar 18 16:10:02 crc kubenswrapper[4696]: I0318 16:10:02.028872 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2cpbx"] Mar 18 16:10:02 crc kubenswrapper[4696]: I0318 16:10:02.040025 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2cpbx"] Mar 18 16:10:02 crc kubenswrapper[4696]: I0318 16:10:02.812065 4696 generic.go:334] "Generic (PLEG): container finished" podID="235ffb53-2fcd-4d61-8f87-f5a687615356" containerID="f4eaf0eca27fa84a780faeac7fbacda4d2b69d486c5a106b9236c5940c2c0ce2" exitCode=0 Mar 18 16:10:02 crc kubenswrapper[4696]: I0318 16:10:02.812194 4696 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29564170-wgdnn" event={"ID":"235ffb53-2fcd-4d61-8f87-f5a687615356","Type":"ContainerDied","Data":"f4eaf0eca27fa84a780faeac7fbacda4d2b69d486c5a106b9236c5940c2c0ce2"} Mar 18 16:10:03 crc kubenswrapper[4696]: I0318 16:10:03.607879 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c1a18ae-c261-421e-907b-e3bb372199a2" path="/var/lib/kubelet/pods/8c1a18ae-c261-421e-907b-e3bb372199a2/volumes" Mar 18 16:10:07 crc kubenswrapper[4696]: I0318 16:10:07.196625 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-wgdnn" Mar 18 16:10:07 crc kubenswrapper[4696]: I0318 16:10:07.283593 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvqp8\" (UniqueName: \"kubernetes.io/projected/235ffb53-2fcd-4d61-8f87-f5a687615356-kube-api-access-dvqp8\") pod \"235ffb53-2fcd-4d61-8f87-f5a687615356\" (UID: \"235ffb53-2fcd-4d61-8f87-f5a687615356\") " Mar 18 16:10:07 crc kubenswrapper[4696]: I0318 16:10:07.288973 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235ffb53-2fcd-4d61-8f87-f5a687615356-kube-api-access-dvqp8" (OuterVolumeSpecName: "kube-api-access-dvqp8") pod "235ffb53-2fcd-4d61-8f87-f5a687615356" (UID: "235ffb53-2fcd-4d61-8f87-f5a687615356"). InnerVolumeSpecName "kube-api-access-dvqp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:07 crc kubenswrapper[4696]: I0318 16:10:07.387732 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvqp8\" (UniqueName: \"kubernetes.io/projected/235ffb53-2fcd-4d61-8f87-f5a687615356-kube-api-access-dvqp8\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:07 crc kubenswrapper[4696]: I0318 16:10:07.863200 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gskmp" event={"ID":"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d","Type":"ContainerStarted","Data":"8a0becac611e69353d3ab091a574b3da9c207912c4be9e0031e7af71920f2efd"} Mar 18 16:10:07 crc kubenswrapper[4696]: I0318 16:10:07.865872 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-wgdnn" event={"ID":"235ffb53-2fcd-4d61-8f87-f5a687615356","Type":"ContainerDied","Data":"0ddb4ae76ffe14f074996e8f32a76a75fd939d4617b695ecb27d44f803c1d649"} Mar 18 16:10:07 crc kubenswrapper[4696]: I0318 16:10:07.865917 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ddb4ae76ffe14f074996e8f32a76a75fd939d4617b695ecb27d44f803c1d649" Mar 18 16:10:07 crc kubenswrapper[4696]: I0318 16:10:07.865968 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-wgdnn" Mar 18 16:10:08 crc kubenswrapper[4696]: I0318 16:10:08.246413 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-b48nr"] Mar 18 16:10:08 crc kubenswrapper[4696]: I0318 16:10:08.254027 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-b48nr"] Mar 18 16:10:09 crc kubenswrapper[4696]: I0318 16:10:09.608766 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64813596-49e9-4f19-8b88-c4861b9ec490" path="/var/lib/kubelet/pods/64813596-49e9-4f19-8b88-c4861b9ec490/volumes" Mar 18 16:10:10 crc kubenswrapper[4696]: I0318 16:10:10.888060 4696 generic.go:334] "Generic (PLEG): container finished" podID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerID="8a0becac611e69353d3ab091a574b3da9c207912c4be9e0031e7af71920f2efd" exitCode=0 Mar 18 16:10:10 crc kubenswrapper[4696]: I0318 16:10:10.888112 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gskmp" event={"ID":"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d","Type":"ContainerDied","Data":"8a0becac611e69353d3ab091a574b3da9c207912c4be9e0031e7af71920f2efd"} Mar 18 16:10:15 crc kubenswrapper[4696]: I0318 16:10:15.930410 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gskmp" event={"ID":"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d","Type":"ContainerStarted","Data":"945fb57c284158d29a55c61cedee33484bfb1a70ba6d2799581a68133b6ee79e"} Mar 18 16:10:15 crc kubenswrapper[4696]: I0318 16:10:15.954393 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gskmp" podStartSLOduration=2.780416005 podStartE2EDuration="18.954375476s" podCreationTimestamp="2026-03-18 16:09:57 +0000 UTC" firstStartedPulling="2026-03-18 16:09:58.775465273 +0000 UTC m=+2041.781639469" lastFinishedPulling="2026-03-18 
16:10:14.949424734 +0000 UTC m=+2057.955598940" observedRunningTime="2026-03-18 16:10:15.94577942 +0000 UTC m=+2058.951953626" watchObservedRunningTime="2026-03-18 16:10:15.954375476 +0000 UTC m=+2058.960549682" Mar 18 16:10:17 crc kubenswrapper[4696]: I0318 16:10:17.797017 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:10:17 crc kubenswrapper[4696]: I0318 16:10:17.797335 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gskmp" Mar 18 16:10:18 crc kubenswrapper[4696]: I0318 16:10:18.858621 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gskmp" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="registry-server" probeResult="failure" output=< Mar 18 16:10:18 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:10:18 crc kubenswrapper[4696]: > Mar 18 16:10:19 crc kubenswrapper[4696]: I0318 16:10:19.967271 4696 generic.go:334] "Generic (PLEG): container finished" podID="3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6" containerID="4c0196b1981d64eec622b8597556c6ffce77f70b7dfec3ea00cde87612e93efb" exitCode=0 Mar 18 16:10:19 crc kubenswrapper[4696]: I0318 16:10:19.967335 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" event={"ID":"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6","Type":"ContainerDied","Data":"4c0196b1981d64eec622b8597556c6ffce77f70b7dfec3ea00cde87612e93efb"} Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.464005 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.555298 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqfwj\" (UniqueName: \"kubernetes.io/projected/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-kube-api-access-nqfwj\") pod \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.555427 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-inventory\") pod \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.555461 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-ssh-key-openstack-edpm-ipam\") pod \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\" (UID: \"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6\") " Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.561832 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-kube-api-access-nqfwj" (OuterVolumeSpecName: "kube-api-access-nqfwj") pod "3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6" (UID: "3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6"). InnerVolumeSpecName "kube-api-access-nqfwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.581494 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6" (UID: "3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.582544 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-inventory" (OuterVolumeSpecName: "inventory") pod "3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6" (UID: "3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.659021 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqfwj\" (UniqueName: \"kubernetes.io/projected/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-kube-api-access-nqfwj\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.659055 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.659065 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.984987 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" 
event={"ID":"3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6","Type":"ContainerDied","Data":"119a81aec1156dcaccef277e5391c1dcd067ecd39ded4a4473d8ed230e107b51"} Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.985020 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119a81aec1156dcaccef277e5391c1dcd067ecd39ded4a4473d8ed230e107b51" Mar 18 16:10:21 crc kubenswrapper[4696]: I0318 16:10:21.985045 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.065742 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v47nc"] Mar 18 16:10:22 crc kubenswrapper[4696]: E0318 16:10:22.066125 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235ffb53-2fcd-4d61-8f87-f5a687615356" containerName="oc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.066141 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="235ffb53-2fcd-4d61-8f87-f5a687615356" containerName="oc" Mar 18 16:10:22 crc kubenswrapper[4696]: E0318 16:10:22.066166 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.066174 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.066335 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.066354 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="235ffb53-2fcd-4d61-8f87-f5a687615356" 
containerName="oc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.067143 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.068936 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.069429 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.072300 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.072459 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.094788 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v47nc"] Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.168425 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmcb2\" (UniqueName: \"kubernetes.io/projected/3460c0d4-77fe-49fd-a525-52b831bf4ff6-kube-api-access-pmcb2\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.168554 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 
16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.168586 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.270801 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.271077 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmcb2\" (UniqueName: \"kubernetes.io/projected/3460c0d4-77fe-49fd-a525-52b831bf4ff6-kube-api-access-pmcb2\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.271235 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.275583 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.278997 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.288108 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmcb2\" (UniqueName: \"kubernetes.io/projected/3460c0d4-77fe-49fd-a525-52b831bf4ff6-kube-api-access-pmcb2\") pod \"ssh-known-hosts-edpm-deployment-v47nc\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.390003 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.465615 4696 scope.go:117] "RemoveContainer" containerID="7607dc7f51f311add0fa7e16a6fe28fb14a30b35f32f5fdddddba2076e9cf6b8" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.547002 4696 scope.go:117] "RemoveContainer" containerID="d92ae8dabbaaa98864ddff9e17467d0e4796962852a20add4f2473e5772709b1" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.618636 4696 scope.go:117] "RemoveContainer" containerID="5b6295e623d9882ee5c1a781ea10e6bca407d26b413bc2def8ab58f02cc88f98" Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.943731 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v47nc"] Mar 18 16:10:22 crc kubenswrapper[4696]: W0318 16:10:22.947680 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3460c0d4_77fe_49fd_a525_52b831bf4ff6.slice/crio-56d994a2bb98b5cd5d460894f402e590a5a576d7f5a3232d19d44ab5a669bfb8 WatchSource:0}: Error finding container 56d994a2bb98b5cd5d460894f402e590a5a576d7f5a3232d19d44ab5a669bfb8: Status 404 returned error can't find the container with id 56d994a2bb98b5cd5d460894f402e590a5a576d7f5a3232d19d44ab5a669bfb8 Mar 18 16:10:22 crc kubenswrapper[4696]: I0318 16:10:22.995792 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" event={"ID":"3460c0d4-77fe-49fd-a525-52b831bf4ff6","Type":"ContainerStarted","Data":"56d994a2bb98b5cd5d460894f402e590a5a576d7f5a3232d19d44ab5a669bfb8"} Mar 18 16:10:26 crc kubenswrapper[4696]: I0318 16:10:26.020888 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" event={"ID":"3460c0d4-77fe-49fd-a525-52b831bf4ff6","Type":"ContainerStarted","Data":"2d71acc9d123c2169f063b6f99ac96fafc516de4077c2347e13e0b75fdfaa729"} Mar 18 16:10:26 crc 
kubenswrapper[4696]: I0318 16:10:26.049068 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" podStartSLOduration=1.949457485 podStartE2EDuration="4.04905301s" podCreationTimestamp="2026-03-18 16:10:22 +0000 UTC" firstStartedPulling="2026-03-18 16:10:22.950086633 +0000 UTC m=+2065.956260839" lastFinishedPulling="2026-03-18 16:10:25.049682158 +0000 UTC m=+2068.055856364" observedRunningTime="2026-03-18 16:10:26.043013558 +0000 UTC m=+2069.049187764" watchObservedRunningTime="2026-03-18 16:10:26.04905301 +0000 UTC m=+2069.055227216" Mar 18 16:10:28 crc kubenswrapper[4696]: I0318 16:10:28.849424 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gskmp" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="registry-server" probeResult="failure" output=< Mar 18 16:10:28 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:10:28 crc kubenswrapper[4696]: > Mar 18 16:10:33 crc kubenswrapper[4696]: I0318 16:10:33.092744 4696 generic.go:334] "Generic (PLEG): container finished" podID="3460c0d4-77fe-49fd-a525-52b831bf4ff6" containerID="2d71acc9d123c2169f063b6f99ac96fafc516de4077c2347e13e0b75fdfaa729" exitCode=0 Mar 18 16:10:33 crc kubenswrapper[4696]: I0318 16:10:33.092866 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" event={"ID":"3460c0d4-77fe-49fd-a525-52b831bf4ff6","Type":"ContainerDied","Data":"2d71acc9d123c2169f063b6f99ac96fafc516de4077c2347e13e0b75fdfaa729"} Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.517185 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.613358 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-inventory-0\") pod \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.613422 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-ssh-key-openstack-edpm-ipam\") pod \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.613462 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmcb2\" (UniqueName: \"kubernetes.io/projected/3460c0d4-77fe-49fd-a525-52b831bf4ff6-kube-api-access-pmcb2\") pod \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\" (UID: \"3460c0d4-77fe-49fd-a525-52b831bf4ff6\") " Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.640750 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3460c0d4-77fe-49fd-a525-52b831bf4ff6-kube-api-access-pmcb2" (OuterVolumeSpecName: "kube-api-access-pmcb2") pod "3460c0d4-77fe-49fd-a525-52b831bf4ff6" (UID: "3460c0d4-77fe-49fd-a525-52b831bf4ff6"). InnerVolumeSpecName "kube-api-access-pmcb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.670737 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3460c0d4-77fe-49fd-a525-52b831bf4ff6" (UID: "3460c0d4-77fe-49fd-a525-52b831bf4ff6"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.670754 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3460c0d4-77fe-49fd-a525-52b831bf4ff6" (UID: "3460c0d4-77fe-49fd-a525-52b831bf4ff6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.716920 4696 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.716966 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3460c0d4-77fe-49fd-a525-52b831bf4ff6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:34 crc kubenswrapper[4696]: I0318 16:10:34.716979 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmcb2\" (UniqueName: \"kubernetes.io/projected/3460c0d4-77fe-49fd-a525-52b831bf4ff6-kube-api-access-pmcb2\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.113390 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" event={"ID":"3460c0d4-77fe-49fd-a525-52b831bf4ff6","Type":"ContainerDied","Data":"56d994a2bb98b5cd5d460894f402e590a5a576d7f5a3232d19d44ab5a669bfb8"} Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.113676 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d994a2bb98b5cd5d460894f402e590a5a576d7f5a3232d19d44ab5a669bfb8" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.113630 
4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v47nc" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.201732 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7"] Mar 18 16:10:35 crc kubenswrapper[4696]: E0318 16:10:35.202139 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3460c0d4-77fe-49fd-a525-52b831bf4ff6" containerName="ssh-known-hosts-edpm-deployment" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.202164 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3460c0d4-77fe-49fd-a525-52b831bf4ff6" containerName="ssh-known-hosts-edpm-deployment" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.202398 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3460c0d4-77fe-49fd-a525-52b831bf4ff6" containerName="ssh-known-hosts-edpm-deployment" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.203175 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.205468 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.205738 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.205874 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.206232 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.213991 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7"] Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.268884 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.268935 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89px4\" (UniqueName: \"kubernetes.io/projected/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-kube-api-access-89px4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.269062 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.370404 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.370490 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.370559 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89px4\" (UniqueName: \"kubernetes.io/projected/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-kube-api-access-89px4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.377377 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.381899 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.388979 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89px4\" (UniqueName: \"kubernetes.io/projected/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-kube-api-access-89px4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rshr7\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:35 crc kubenswrapper[4696]: I0318 16:10:35.577046 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:36 crc kubenswrapper[4696]: I0318 16:10:36.210240 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7"] Mar 18 16:10:37 crc kubenswrapper[4696]: I0318 16:10:37.136188 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" event={"ID":"174da10e-47cb-4e7a-8226-e7a4baeaf2ac","Type":"ContainerStarted","Data":"8465b640fa1adcd7e6e55c61ea318d203c09563ca7ba6ed8412d2b6158bb3231"} Mar 18 16:10:37 crc kubenswrapper[4696]: I0318 16:10:37.137006 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" event={"ID":"174da10e-47cb-4e7a-8226-e7a4baeaf2ac","Type":"ContainerStarted","Data":"b76952d67c484c3b9591e13a7fdd1f1e1c270445380858e01a0ed11d24765cde"} Mar 18 16:10:37 crc kubenswrapper[4696]: I0318 16:10:37.161840 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" podStartSLOduration=1.7536761090000001 podStartE2EDuration="2.161818383s" podCreationTimestamp="2026-03-18 16:10:35 +0000 UTC" firstStartedPulling="2026-03-18 16:10:36.223187212 +0000 UTC m=+2079.229361408" lastFinishedPulling="2026-03-18 16:10:36.631329466 +0000 UTC m=+2079.637503682" observedRunningTime="2026-03-18 16:10:37.157579276 +0000 UTC m=+2080.163753502" watchObservedRunningTime="2026-03-18 16:10:37.161818383 +0000 UTC m=+2080.167992589" Mar 18 16:10:38 crc kubenswrapper[4696]: I0318 16:10:38.846632 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gskmp" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="registry-server" probeResult="failure" output=< Mar 18 16:10:38 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:10:38 crc 
kubenswrapper[4696]: > Mar 18 16:10:42 crc kubenswrapper[4696]: I0318 16:10:42.184614 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:10:42 crc kubenswrapper[4696]: I0318 16:10:42.185171 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:10:45 crc kubenswrapper[4696]: I0318 16:10:45.045110 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vtbst"] Mar 18 16:10:45 crc kubenswrapper[4696]: I0318 16:10:45.055950 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vtbst"] Mar 18 16:10:45 crc kubenswrapper[4696]: I0318 16:10:45.199473 4696 generic.go:334] "Generic (PLEG): container finished" podID="174da10e-47cb-4e7a-8226-e7a4baeaf2ac" containerID="8465b640fa1adcd7e6e55c61ea318d203c09563ca7ba6ed8412d2b6158bb3231" exitCode=0 Mar 18 16:10:45 crc kubenswrapper[4696]: I0318 16:10:45.199534 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" event={"ID":"174da10e-47cb-4e7a-8226-e7a4baeaf2ac","Type":"ContainerDied","Data":"8465b640fa1adcd7e6e55c61ea318d203c09563ca7ba6ed8412d2b6158bb3231"} Mar 18 16:10:45 crc kubenswrapper[4696]: I0318 16:10:45.614096 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d77d81-5c38-41dd-818b-103a8059e1f9" path="/var/lib/kubelet/pods/99d77d81-5c38-41dd-818b-103a8059e1f9/volumes" Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 
16:10:46.609992 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.699302 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89px4\" (UniqueName: \"kubernetes.io/projected/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-kube-api-access-89px4\") pod \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.699386 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-ssh-key-openstack-edpm-ipam\") pod \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.699590 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-inventory\") pod \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\" (UID: \"174da10e-47cb-4e7a-8226-e7a4baeaf2ac\") " Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.706589 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-kube-api-access-89px4" (OuterVolumeSpecName: "kube-api-access-89px4") pod "174da10e-47cb-4e7a-8226-e7a4baeaf2ac" (UID: "174da10e-47cb-4e7a-8226-e7a4baeaf2ac"). InnerVolumeSpecName "kube-api-access-89px4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.732131 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-inventory" (OuterVolumeSpecName: "inventory") pod "174da10e-47cb-4e7a-8226-e7a4baeaf2ac" (UID: "174da10e-47cb-4e7a-8226-e7a4baeaf2ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.743666 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "174da10e-47cb-4e7a-8226-e7a4baeaf2ac" (UID: "174da10e-47cb-4e7a-8226-e7a4baeaf2ac"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.801628 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.801671 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89px4\" (UniqueName: \"kubernetes.io/projected/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-kube-api-access-89px4\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:46 crc kubenswrapper[4696]: I0318 16:10:46.801685 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/174da10e-47cb-4e7a-8226-e7a4baeaf2ac-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.217927 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" 
event={"ID":"174da10e-47cb-4e7a-8226-e7a4baeaf2ac","Type":"ContainerDied","Data":"b76952d67c484c3b9591e13a7fdd1f1e1c270445380858e01a0ed11d24765cde"} Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.217973 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b76952d67c484c3b9591e13a7fdd1f1e1c270445380858e01a0ed11d24765cde" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.218025 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rshr7" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.294351 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"] Mar 18 16:10:47 crc kubenswrapper[4696]: E0318 16:10:47.294967 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174da10e-47cb-4e7a-8226-e7a4baeaf2ac" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.295002 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="174da10e-47cb-4e7a-8226-e7a4baeaf2ac" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.295391 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="174da10e-47cb-4e7a-8226-e7a4baeaf2ac" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.296501 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.299600 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.299633 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.299795 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.300943 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.303143 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"] Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.411683 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.411846 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-249pp\" (UniqueName: \"kubernetes.io/projected/43fbb202-ffe4-40ba-b61e-ea284e533c1f-kube-api-access-249pp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg" Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 
16:10:47.411919 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.513709 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-249pp\" (UniqueName: \"kubernetes.io/projected/43fbb202-ffe4-40ba-b61e-ea284e533c1f-kube-api-access-249pp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.513801 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.513871 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.517674 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.518989 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.535767 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-249pp\" (UniqueName: \"kubernetes.io/projected/43fbb202-ffe4-40ba-b61e-ea284e533c1f-kube-api-access-249pp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:47 crc kubenswrapper[4696]: I0318 16:10:47.612986 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:48 crc kubenswrapper[4696]: I0318 16:10:48.222785 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"]
Mar 18 16:10:48 crc kubenswrapper[4696]: I0318 16:10:48.858686 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gskmp" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="registry-server" probeResult="failure" output=<
Mar 18 16:10:48 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s
Mar 18 16:10:48 crc kubenswrapper[4696]: >
Mar 18 16:10:49 crc kubenswrapper[4696]: I0318 16:10:49.242537 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg" event={"ID":"43fbb202-ffe4-40ba-b61e-ea284e533c1f","Type":"ContainerStarted","Data":"95daaabfbcec3d34cd6240e336a79b0a8446e1a6db6da91d8eb8c9959c8f963a"}
Mar 18 16:10:50 crc kubenswrapper[4696]: I0318 16:10:50.263014 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg" event={"ID":"43fbb202-ffe4-40ba-b61e-ea284e533c1f","Type":"ContainerStarted","Data":"949fb85faf7e7a9f169a454522e633a9983dfca5223d814876b9908d83d3633c"}
Mar 18 16:10:50 crc kubenswrapper[4696]: I0318 16:10:50.283188 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg" podStartSLOduration=2.455595599 podStartE2EDuration="3.283171303s" podCreationTimestamp="2026-03-18 16:10:47 +0000 UTC" firstStartedPulling="2026-03-18 16:10:48.236508712 +0000 UTC m=+2091.242682918" lastFinishedPulling="2026-03-18 16:10:49.064084416 +0000 UTC m=+2092.070258622" observedRunningTime="2026-03-18 16:10:50.276184157 +0000 UTC m=+2093.282358363" watchObservedRunningTime="2026-03-18 16:10:50.283171303 +0000 UTC m=+2093.289345509"
Mar 18 16:10:57 crc kubenswrapper[4696]: I0318 16:10:57.850626 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gskmp"
Mar 18 16:10:57 crc kubenswrapper[4696]: I0318 16:10:57.904484 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gskmp"
Mar 18 16:10:58 crc kubenswrapper[4696]: I0318 16:10:58.338639 4696 generic.go:334] "Generic (PLEG): container finished" podID="43fbb202-ffe4-40ba-b61e-ea284e533c1f" containerID="949fb85faf7e7a9f169a454522e633a9983dfca5223d814876b9908d83d3633c" exitCode=0
Mar 18 16:10:58 crc kubenswrapper[4696]: I0318 16:10:58.338726 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg" event={"ID":"43fbb202-ffe4-40ba-b61e-ea284e533c1f","Type":"ContainerDied","Data":"949fb85faf7e7a9f169a454522e633a9983dfca5223d814876b9908d83d3633c"}
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.740229 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.858781 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249pp\" (UniqueName: \"kubernetes.io/projected/43fbb202-ffe4-40ba-b61e-ea284e533c1f-kube-api-access-249pp\") pod \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") "
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.858948 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-ssh-key-openstack-edpm-ipam\") pod \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") "
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.860106 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-inventory\") pod \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\" (UID: \"43fbb202-ffe4-40ba-b61e-ea284e533c1f\") "
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.864743 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fbb202-ffe4-40ba-b61e-ea284e533c1f-kube-api-access-249pp" (OuterVolumeSpecName: "kube-api-access-249pp") pod "43fbb202-ffe4-40ba-b61e-ea284e533c1f" (UID: "43fbb202-ffe4-40ba-b61e-ea284e533c1f"). InnerVolumeSpecName "kube-api-access-249pp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.879677 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gskmp"]
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.920550 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "43fbb202-ffe4-40ba-b61e-ea284e533c1f" (UID: "43fbb202-ffe4-40ba-b61e-ea284e533c1f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.920636 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-inventory" (OuterVolumeSpecName: "inventory") pod "43fbb202-ffe4-40ba-b61e-ea284e533c1f" (UID: "43fbb202-ffe4-40ba-b61e-ea284e533c1f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.962597 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249pp\" (UniqueName: \"kubernetes.io/projected/43fbb202-ffe4-40ba-b61e-ea284e533c1f-kube-api-access-249pp\") on node \"crc\" DevicePath \"\""
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.962628 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 16:10:59 crc kubenswrapper[4696]: I0318 16:10:59.962640 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fbb202-ffe4-40ba-b61e-ea284e533c1f-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.254664 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxwqj"]
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.255248 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gxwqj" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" containerName="registry-server" containerID="cri-o://6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642" gracePeriod=2
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.361608 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg" event={"ID":"43fbb202-ffe4-40ba-b61e-ea284e533c1f","Type":"ContainerDied","Data":"95daaabfbcec3d34cd6240e336a79b0a8446e1a6db6da91d8eb8c9959c8f963a"}
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.361645 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95daaabfbcec3d34cd6240e336a79b0a8446e1a6db6da91d8eb8c9959c8f963a"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.361696 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.463104 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"]
Mar 18 16:11:00 crc kubenswrapper[4696]: E0318 16:11:00.463847 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fbb202-ffe4-40ba-b61e-ea284e533c1f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.464307 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fbb202-ffe4-40ba-b61e-ea284e533c1f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.477467 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fbb202-ffe4-40ba-b61e-ea284e533c1f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.478294 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"]
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.478401 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.480424 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.480727 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.482733 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.483185 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.483496 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.484564 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.485558 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.490338 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575364 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575536 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575633 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575670 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575705 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575722 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575748 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575791 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575819 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575893 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575946 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.575963 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.576017 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drjbc\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-kube-api-access-drjbc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.576074 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.677460 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.677752 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.677904 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drjbc\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-kube-api-access-drjbc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.678036 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.678194 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.678301 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.678414 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.678536 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.678661 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.678766 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.678873 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.679009 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.679131 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.679280 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.684420 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.685224 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.687059 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.687582 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.687908 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.688381 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.688781 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.689023 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.689124 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.690708 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.692972 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.687934 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.694780 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.699982 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drjbc\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-kube-api-access-drjbc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27gg\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.793163 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.796583 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxwqj"
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.884792 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-catalog-content\") pod \"cee4651c-7e34-42f9-bb81-9537803fa622\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") "
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.884887 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xmp2\" (UniqueName: \"kubernetes.io/projected/cee4651c-7e34-42f9-bb81-9537803fa622-kube-api-access-6xmp2\") pod \"cee4651c-7e34-42f9-bb81-9537803fa622\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") "
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.885332 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-utilities\") pod \"cee4651c-7e34-42f9-bb81-9537803fa622\" (UID: \"cee4651c-7e34-42f9-bb81-9537803fa622\") "
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.886402 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-utilities" (OuterVolumeSpecName: "utilities") pod "cee4651c-7e34-42f9-bb81-9537803fa622" (UID: "cee4651c-7e34-42f9-bb81-9537803fa622"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.886759 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.888825 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee4651c-7e34-42f9-bb81-9537803fa622-kube-api-access-6xmp2" (OuterVolumeSpecName: "kube-api-access-6xmp2") pod "cee4651c-7e34-42f9-bb81-9537803fa622" (UID: "cee4651c-7e34-42f9-bb81-9537803fa622"). InnerVolumeSpecName "kube-api-access-6xmp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:11:00 crc kubenswrapper[4696]: I0318 16:11:00.989954 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xmp2\" (UniqueName: \"kubernetes.io/projected/cee4651c-7e34-42f9-bb81-9537803fa622-kube-api-access-6xmp2\") on node \"crc\" DevicePath \"\""
Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.030216 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cee4651c-7e34-42f9-bb81-9537803fa622" (UID: "cee4651c-7e34-42f9-bb81-9537803fa622"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.091849 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee4651c-7e34-42f9-bb81-9537803fa622-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.371166 4696 generic.go:334] "Generic (PLEG): container finished" podID="cee4651c-7e34-42f9-bb81-9537803fa622" containerID="6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642" exitCode=0 Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.371218 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxwqj" event={"ID":"cee4651c-7e34-42f9-bb81-9537803fa622","Type":"ContainerDied","Data":"6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642"} Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.371251 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxwqj" event={"ID":"cee4651c-7e34-42f9-bb81-9537803fa622","Type":"ContainerDied","Data":"c84647d097619702e7e00b1e786d0990520c302cf29aac31dee929e75b548639"} Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.371272 4696 scope.go:117] "RemoveContainer" containerID="6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.371435 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxwqj" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.404672 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxwqj"] Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.416245 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gxwqj"] Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.417015 4696 scope.go:117] "RemoveContainer" containerID="a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.437826 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"] Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.437954 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.456238 4696 scope.go:117] "RemoveContainer" containerID="7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.478725 4696 scope.go:117] "RemoveContainer" containerID="6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642" Mar 18 16:11:01 crc kubenswrapper[4696]: E0318 16:11:01.479063 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642\": container with ID starting with 6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642 not found: ID does not exist" containerID="6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.479089 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642"} err="failed to get container status \"6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642\": rpc error: code = NotFound desc = could not find container \"6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642\": container with ID starting with 6b129f5ed6bef1fd2d57f01763d905d4ceb7cfd0e6a1746fde7e849a32932642 not found: ID does not exist" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.479111 4696 scope.go:117] "RemoveContainer" containerID="a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17" Mar 18 16:11:01 crc kubenswrapper[4696]: E0318 16:11:01.479465 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17\": container with ID starting with a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17 not found: ID does not exist" containerID="a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.479485 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17"} err="failed to get container status \"a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17\": rpc error: code = NotFound desc = could not find container \"a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17\": container with ID starting with a05d70feae69385aed64cf9538cbc5ba20c36ff65429f7c0ef2eb919ceb5bc17 not found: ID does not exist" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.479497 4696 scope.go:117] "RemoveContainer" containerID="7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25" Mar 18 16:11:01 crc kubenswrapper[4696]: E0318 16:11:01.479904 4696 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25\": container with ID starting with 7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25 not found: ID does not exist" containerID="7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.479978 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25"} err="failed to get container status \"7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25\": rpc error: code = NotFound desc = could not find container \"7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25\": container with ID starting with 7c3bc34bfb15f69c948d069c4d17877d2d9f69a84712c8d047d2a694b9cf4e25 not found: ID does not exist" Mar 18 16:11:01 crc kubenswrapper[4696]: I0318 16:11:01.608380 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" path="/var/lib/kubelet/pods/cee4651c-7e34-42f9-bb81-9537803fa622/volumes" Mar 18 16:11:02 crc kubenswrapper[4696]: I0318 16:11:02.381290 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg" event={"ID":"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d","Type":"ContainerStarted","Data":"f67c1f0a8f064b2f0eeaee16a5329afda13ccdfc15962a25cdb1a6c08b977676"} Mar 18 16:11:02 crc kubenswrapper[4696]: I0318 16:11:02.381759 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg" event={"ID":"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d","Type":"ContainerStarted","Data":"f2d3b81a52b07c6da935064182510d2fcc1be05698c807f1155fc827676908e4"} Mar 18 16:11:02 crc kubenswrapper[4696]: I0318 16:11:02.406000 4696 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg" podStartSLOduration=2.218549211 podStartE2EDuration="2.405980673s" podCreationTimestamp="2026-03-18 16:11:00 +0000 UTC" firstStartedPulling="2026-03-18 16:11:01.437760008 +0000 UTC m=+2104.443934214" lastFinishedPulling="2026-03-18 16:11:01.62519147 +0000 UTC m=+2104.631365676" observedRunningTime="2026-03-18 16:11:02.402898226 +0000 UTC m=+2105.409072452" watchObservedRunningTime="2026-03-18 16:11:02.405980673 +0000 UTC m=+2105.412154869" Mar 18 16:11:12 crc kubenswrapper[4696]: I0318 16:11:12.185128 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:11:12 crc kubenswrapper[4696]: I0318 16:11:12.185739 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:11:22 crc kubenswrapper[4696]: I0318 16:11:22.767277 4696 scope.go:117] "RemoveContainer" containerID="5f7dd1236fe8ba1ffb40e9bd23f1c5b0a1957e06c474823e571e1acc33178c70" Mar 18 16:11:37 crc kubenswrapper[4696]: I0318 16:11:37.721175 4696 generic.go:334] "Generic (PLEG): container finished" podID="f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" containerID="f67c1f0a8f064b2f0eeaee16a5329afda13ccdfc15962a25cdb1a6c08b977676" exitCode=0 Mar 18 16:11:37 crc kubenswrapper[4696]: I0318 16:11:37.721290 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg" 
event={"ID":"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d","Type":"ContainerDied","Data":"f67c1f0a8f064b2f0eeaee16a5329afda13ccdfc15962a25cdb1a6c08b977676"} Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.236375 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.351997 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352352 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-inventory\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352476 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-repo-setup-combined-ca-bundle\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352502 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-bootstrap-combined-ca-bundle\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352562 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352637 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ssh-key-openstack-edpm-ipam\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352669 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352689 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ovn-combined-ca-bundle\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352734 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-libvirt-combined-ca-bundle\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352758 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-nova-combined-ca-bundle\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352819 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-neutron-metadata-combined-ca-bundle\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352845 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drjbc\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-kube-api-access-drjbc\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352870 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.352896 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-telemetry-combined-ca-bundle\") pod \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\" (UID: \"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d\") " Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.360249 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-bootstrap-combined-ca-bundle" 
(OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.360785 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.360860 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.360946 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.361182 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.361182 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.361316 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.362750 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.363271 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-kube-api-access-drjbc" (OuterVolumeSpecName: "kube-api-access-drjbc") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "kube-api-access-drjbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.363353 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.363890 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.369912 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). 
InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.390350 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.394834 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-inventory" (OuterVolumeSpecName: "inventory") pod "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" (UID: "f5a70cb2-3b7d-43ab-9ab6-c154a737db7d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.455356 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.455463 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.455552 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc 
kubenswrapper[4696]: I0318 16:11:39.455626 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.455894 4696 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.455975 4696 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.456057 4696 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.456416 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drjbc\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-kube-api-access-drjbc\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.456507 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.456633 4696 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-telemetry-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\""
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.456706 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.456771 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.456833 4696 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.456894 4696 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a70cb2-3b7d-43ab-9ab6-c154a737db7d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.737045 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg" event={"ID":"f5a70cb2-3b7d-43ab-9ab6-c154a737db7d","Type":"ContainerDied","Data":"f2d3b81a52b07c6da935064182510d2fcc1be05698c807f1155fc827676908e4"}
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.737085 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2d3b81a52b07c6da935064182510d2fcc1be05698c807f1155fc827676908e4"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.737117 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27gg"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.843509 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"]
Mar 18 16:11:39 crc kubenswrapper[4696]: E0318 16:11:39.843885 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" containerName="extract-content"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.843905 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" containerName="extract-content"
Mar 18 16:11:39 crc kubenswrapper[4696]: E0318 16:11:39.843922 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" containerName="extract-utilities"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.843929 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" containerName="extract-utilities"
Mar 18 16:11:39 crc kubenswrapper[4696]: E0318 16:11:39.843946 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" containerName="registry-server"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.843952 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" containerName="registry-server"
Mar 18 16:11:39 crc kubenswrapper[4696]: E0318 16:11:39.843977 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.843985 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.844149 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a70cb2-3b7d-43ab-9ab6-c154a737db7d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.844166 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee4651c-7e34-42f9-bb81-9537803fa622" containerName="registry-server"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.844930 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.851335 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.851366 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.851422 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.851452 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.851360 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.859674 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"]
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.969606 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.969690 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.969814 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsk8\" (UniqueName: \"kubernetes.io/projected/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-kube-api-access-nmsk8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.969952 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:39 crc kubenswrapper[4696]: I0318 16:11:39.970035 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.071913 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.071959 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.071985 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsk8\" (UniqueName: \"kubernetes.io/projected/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-kube-api-access-nmsk8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.072025 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.072046 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.073718 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.075926 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.076044 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.080129 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.089618 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsk8\" (UniqueName: \"kubernetes.io/projected/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-kube-api-access-nmsk8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gsd7h\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.172205 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.729044 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"]
Mar 18 16:11:40 crc kubenswrapper[4696]: I0318 16:11:40.753096 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h" event={"ID":"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73","Type":"ContainerStarted","Data":"8df7f5c438e1fc63e8802c9c7541491173301b5e6b5edf7bcc4d3cb2f311c313"}
Mar 18 16:11:41 crc kubenswrapper[4696]: I0318 16:11:41.764461 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h" event={"ID":"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73","Type":"ContainerStarted","Data":"f31fc98a1174ca9d76985fadafdcb13ee4b2ff7063fca4519e9b69549f61c9e0"}
Mar 18 16:11:41 crc kubenswrapper[4696]: I0318 16:11:41.790920 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h" podStartSLOduration=2.3078475 podStartE2EDuration="2.790901932s" podCreationTimestamp="2026-03-18 16:11:39 +0000 UTC" firstStartedPulling="2026-03-18 16:11:40.738628868 +0000 UTC m=+2143.744803074" lastFinishedPulling="2026-03-18 16:11:41.2216833 +0000 UTC m=+2144.227857506" observedRunningTime="2026-03-18 16:11:41.782017848 +0000 UTC m=+2144.788192054" watchObservedRunningTime="2026-03-18 16:11:41.790901932 +0000 UTC m=+2144.797076138"
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.184824 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.184886 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.184936 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr"
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.185680 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"418d22c717d701d8c2458f1bdfe29acc3f5066a0a4b0d71d5979be21ee6f29d7"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.185740 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://418d22c717d701d8c2458f1bdfe29acc3f5066a0a4b0d71d5979be21ee6f29d7" gracePeriod=600
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.776646 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="418d22c717d701d8c2458f1bdfe29acc3f5066a0a4b0d71d5979be21ee6f29d7" exitCode=0
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.776748 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"418d22c717d701d8c2458f1bdfe29acc3f5066a0a4b0d71d5979be21ee6f29d7"}
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.777049 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f"}
Mar 18 16:11:42 crc kubenswrapper[4696]: I0318 16:11:42.777072 4696 scope.go:117] "RemoveContainer" containerID="aeb24b3b4dea9e3c22d17758f03e64881e3006aa5b568dc58bf6c726a23264a9"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.134648 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564172-w46vq"]
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.136672 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-w46vq"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.139316 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.139799 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.141474 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.143014 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-w46vq"]
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.226919 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpr9w\" (UniqueName: \"kubernetes.io/projected/ec181c54-af7d-4bda-aace-bac85ff76032-kube-api-access-xpr9w\") pod \"auto-csr-approver-29564172-w46vq\" (UID: \"ec181c54-af7d-4bda-aace-bac85ff76032\") " pod="openshift-infra/auto-csr-approver-29564172-w46vq"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.328645 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpr9w\" (UniqueName: \"kubernetes.io/projected/ec181c54-af7d-4bda-aace-bac85ff76032-kube-api-access-xpr9w\") pod \"auto-csr-approver-29564172-w46vq\" (UID: \"ec181c54-af7d-4bda-aace-bac85ff76032\") " pod="openshift-infra/auto-csr-approver-29564172-w46vq"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.348809 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpr9w\" (UniqueName: \"kubernetes.io/projected/ec181c54-af7d-4bda-aace-bac85ff76032-kube-api-access-xpr9w\") pod \"auto-csr-approver-29564172-w46vq\" (UID: \"ec181c54-af7d-4bda-aace-bac85ff76032\") " pod="openshift-infra/auto-csr-approver-29564172-w46vq"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.457735 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-w46vq"
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.911555 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-w46vq"]
Mar 18 16:12:00 crc kubenswrapper[4696]: W0318 16:12:00.916366 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec181c54_af7d_4bda_aace_bac85ff76032.slice/crio-98e4ed53f1ef77ce75f34db427dc139c468749328b0ab8ab6947d3d31eee6394 WatchSource:0}: Error finding container 98e4ed53f1ef77ce75f34db427dc139c468749328b0ab8ab6947d3d31eee6394: Status 404 returned error can't find the container with id 98e4ed53f1ef77ce75f34db427dc139c468749328b0ab8ab6947d3d31eee6394
Mar 18 16:12:00 crc kubenswrapper[4696]: I0318 16:12:00.942255 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-w46vq" event={"ID":"ec181c54-af7d-4bda-aace-bac85ff76032","Type":"ContainerStarted","Data":"98e4ed53f1ef77ce75f34db427dc139c468749328b0ab8ab6947d3d31eee6394"}
Mar 18 16:12:03 crc kubenswrapper[4696]: I0318 16:12:03.969032 4696 generic.go:334] "Generic (PLEG): container finished" podID="ec181c54-af7d-4bda-aace-bac85ff76032" containerID="e4cc71585d860b3d30e89080d0040f9fe626cb91f87f9e375272bcfd3c400bad" exitCode=0
Mar 18 16:12:03 crc kubenswrapper[4696]: I0318 16:12:03.969154 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-w46vq" event={"ID":"ec181c54-af7d-4bda-aace-bac85ff76032","Type":"ContainerDied","Data":"e4cc71585d860b3d30e89080d0040f9fe626cb91f87f9e375272bcfd3c400bad"}
Mar 18 16:12:05 crc kubenswrapper[4696]: I0318 16:12:05.358020 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-w46vq"
Mar 18 16:12:05 crc kubenswrapper[4696]: I0318 16:12:05.532956 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpr9w\" (UniqueName: \"kubernetes.io/projected/ec181c54-af7d-4bda-aace-bac85ff76032-kube-api-access-xpr9w\") pod \"ec181c54-af7d-4bda-aace-bac85ff76032\" (UID: \"ec181c54-af7d-4bda-aace-bac85ff76032\") "
Mar 18 16:12:05 crc kubenswrapper[4696]: I0318 16:12:05.543683 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec181c54-af7d-4bda-aace-bac85ff76032-kube-api-access-xpr9w" (OuterVolumeSpecName: "kube-api-access-xpr9w") pod "ec181c54-af7d-4bda-aace-bac85ff76032" (UID: "ec181c54-af7d-4bda-aace-bac85ff76032"). InnerVolumeSpecName "kube-api-access-xpr9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:12:05 crc kubenswrapper[4696]: I0318 16:12:05.634977 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpr9w\" (UniqueName: \"kubernetes.io/projected/ec181c54-af7d-4bda-aace-bac85ff76032-kube-api-access-xpr9w\") on node \"crc\" DevicePath \"\""
Mar 18 16:12:05 crc kubenswrapper[4696]: I0318 16:12:05.986259 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-w46vq" event={"ID":"ec181c54-af7d-4bda-aace-bac85ff76032","Type":"ContainerDied","Data":"98e4ed53f1ef77ce75f34db427dc139c468749328b0ab8ab6947d3d31eee6394"}
Mar 18 16:12:05 crc kubenswrapper[4696]: I0318 16:12:05.986298 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98e4ed53f1ef77ce75f34db427dc139c468749328b0ab8ab6947d3d31eee6394"
Mar 18 16:12:05 crc kubenswrapper[4696]: I0318 16:12:05.986355 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-w46vq"
Mar 18 16:12:06 crc kubenswrapper[4696]: I0318 16:12:06.430960 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-dwq7v"]
Mar 18 16:12:06 crc kubenswrapper[4696]: I0318 16:12:06.439237 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-dwq7v"]
Mar 18 16:12:07 crc kubenswrapper[4696]: I0318 16:12:07.613071 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faafbd48-4d87-4569-ade7-f00d32a4c6a2" path="/var/lib/kubelet/pods/faafbd48-4d87-4569-ade7-f00d32a4c6a2/volumes"
Mar 18 16:12:22 crc kubenswrapper[4696]: I0318 16:12:22.851681 4696 scope.go:117] "RemoveContainer" containerID="4237901eadb94cb2b952bf5b0f3902642c324f0936efaecf4454130e7eb0819c"
Mar 18 16:12:39 crc kubenswrapper[4696]: I0318 16:12:39.887063 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-92djs"]
Mar 18 16:12:39 crc kubenswrapper[4696]: E0318 16:12:39.889343 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec181c54-af7d-4bda-aace-bac85ff76032" containerName="oc"
Mar 18 16:12:39 crc kubenswrapper[4696]: I0318 16:12:39.889368 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec181c54-af7d-4bda-aace-bac85ff76032" containerName="oc"
Mar 18 16:12:39 crc kubenswrapper[4696]: I0318 16:12:39.889578 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec181c54-af7d-4bda-aace-bac85ff76032" containerName="oc"
Mar 18 16:12:39 crc kubenswrapper[4696]: I0318 16:12:39.891001 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:39 crc kubenswrapper[4696]: I0318 16:12:39.896645 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92djs"]
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.042899 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-utilities\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.043032 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8gcx\" (UniqueName: \"kubernetes.io/projected/23de02e1-77ae-45d8-a49f-9bea005f18d4-kube-api-access-j8gcx\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.043154 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-catalog-content\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.145169 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8gcx\" (UniqueName: \"kubernetes.io/projected/23de02e1-77ae-45d8-a49f-9bea005f18d4-kube-api-access-j8gcx\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.145335 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-catalog-content\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.145401 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-utilities\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.145984 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-catalog-content\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.146028 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-utilities\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.164714 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8gcx\" (UniqueName: \"kubernetes.io/projected/23de02e1-77ae-45d8-a49f-9bea005f18d4-kube-api-access-j8gcx\") pod \"community-operators-92djs\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.223266 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-92djs"
Mar 18 16:12:40 crc kubenswrapper[4696]: I0318 16:12:40.759313 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92djs"]
Mar 18 16:12:41 crc kubenswrapper[4696]: I0318 16:12:41.283832 4696 generic.go:334] "Generic (PLEG): container finished" podID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerID="608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d" exitCode=0
Mar 18 16:12:41 crc kubenswrapper[4696]: I0318 16:12:41.283890 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92djs" event={"ID":"23de02e1-77ae-45d8-a49f-9bea005f18d4","Type":"ContainerDied","Data":"608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d"}
Mar 18 16:12:41 crc kubenswrapper[4696]: I0318 16:12:41.283932 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92djs" event={"ID":"23de02e1-77ae-45d8-a49f-9bea005f18d4","Type":"ContainerStarted","Data":"394243e219f3337ed7a64f53f5b697162b357ae4e6fd1dbaf180faa36e171930"}
Mar 18 16:12:42 crc kubenswrapper[4696]: I0318 16:12:42.298130 4696 generic.go:334] "Generic (PLEG): container finished" podID="8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" containerID="f31fc98a1174ca9d76985fadafdcb13ee4b2ff7063fca4519e9b69549f61c9e0" exitCode=0
Mar 18 16:12:42 crc kubenswrapper[4696]: I0318 16:12:42.298266 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h" event={"ID":"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73","Type":"ContainerDied","Data":"f31fc98a1174ca9d76985fadafdcb13ee4b2ff7063fca4519e9b69549f61c9e0"}
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.312390 4696 generic.go:334] "Generic (PLEG): container finished" podID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerID="82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da" exitCode=0
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.312474 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92djs" event={"ID":"23de02e1-77ae-45d8-a49f-9bea005f18d4","Type":"ContainerDied","Data":"82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da"}
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.723142 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.823949 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovncontroller-config-0\") pod \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") "
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.824117 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ssh-key-openstack-edpm-ipam\") pod \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") "
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.824187 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovn-combined-ca-bundle\") pod \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") "
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.824232 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmsk8\" (UniqueName: \"kubernetes.io/projected/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-kube-api-access-nmsk8\") pod \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") "
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.824269 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-inventory\") pod \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\" (UID: \"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73\") "
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.830801 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-kube-api-access-nmsk8" (OuterVolumeSpecName: "kube-api-access-nmsk8") pod "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" (UID: "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73"). InnerVolumeSpecName "kube-api-access-nmsk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.831288 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" (UID: "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.850022 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-inventory" (OuterVolumeSpecName: "inventory") pod "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" (UID: "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.851614 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" (UID: "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.854026 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" (UID: "8ac2ae34-5ffd-4557-96ae-c4d268e2cf73"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.927090 4696 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.927122 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.927133 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.927192 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmsk8\" (UniqueName: \"kubernetes.io/projected/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-kube-api-access-nmsk8\") on node \"crc\" DevicePath \"\""
Mar 18 16:12:43 crc kubenswrapper[4696]: I0318 16:12:43.927203 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac2ae34-5ffd-4557-96ae-c4d268e2cf73-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.321610 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h" event={"ID":"8ac2ae34-5ffd-4557-96ae-c4d268e2cf73","Type":"ContainerDied","Data":"8df7f5c438e1fc63e8802c9c7541491173301b5e6b5edf7bcc4d3cb2f311c313"}
Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.321656 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8df7f5c438e1fc63e8802c9c7541491173301b5e6b5edf7bcc4d3cb2f311c313"
Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.321705 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gsd7h"
Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.417227 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7"]
Mar 18 16:12:44 crc kubenswrapper[4696]: E0318 16:12:44.417800 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.417828 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.418055 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac2ae34-5ffd-4557-96ae-c4d268e2cf73" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.420100 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.439690 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.439769 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.440006 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.440051 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.440062 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.440094 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.457883 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7"] Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.543102 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.543197 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.543248 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.543278 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.543313 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.543408 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-m7zc6\" (UniqueName: \"kubernetes.io/projected/a50d071c-8a54-4335-be8c-1842e52dcb81-kube-api-access-m7zc6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.645171 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7zc6\" (UniqueName: \"kubernetes.io/projected/a50d071c-8a54-4335-be8c-1842e52dcb81-kube-api-access-m7zc6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.645289 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.645339 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.645378 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.645413 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.645444 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.650291 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.650567 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.650567 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.652606 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.656936 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.671113 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7zc6\" (UniqueName: \"kubernetes.io/projected/a50d071c-8a54-4335-be8c-1842e52dcb81-kube-api-access-m7zc6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7\" (UID: 
\"a50d071c-8a54-4335-be8c-1842e52dcb81\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:44 crc kubenswrapper[4696]: I0318 16:12:44.760064 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:12:45 crc kubenswrapper[4696]: I0318 16:12:45.319739 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7"] Mar 18 16:12:45 crc kubenswrapper[4696]: I0318 16:12:45.334922 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92djs" event={"ID":"23de02e1-77ae-45d8-a49f-9bea005f18d4","Type":"ContainerStarted","Data":"a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd"} Mar 18 16:12:45 crc kubenswrapper[4696]: W0318 16:12:45.370705 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda50d071c_8a54_4335_be8c_1842e52dcb81.slice/crio-cdbbcff0321370ef4e775b27b87a5d26c8bfa1f3c87c957bbd851dd346635bc4 WatchSource:0}: Error finding container cdbbcff0321370ef4e775b27b87a5d26c8bfa1f3c87c957bbd851dd346635bc4: Status 404 returned error can't find the container with id cdbbcff0321370ef4e775b27b87a5d26c8bfa1f3c87c957bbd851dd346635bc4 Mar 18 16:12:45 crc kubenswrapper[4696]: I0318 16:12:45.389581 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-92djs" podStartSLOduration=3.278222583 podStartE2EDuration="6.389466464s" podCreationTimestamp="2026-03-18 16:12:39 +0000 UTC" firstStartedPulling="2026-03-18 16:12:41.285985529 +0000 UTC m=+2204.292159735" lastFinishedPulling="2026-03-18 16:12:44.39722941 +0000 UTC m=+2207.403403616" observedRunningTime="2026-03-18 16:12:45.37894613 +0000 UTC m=+2208.385120336" watchObservedRunningTime="2026-03-18 16:12:45.389466464 
+0000 UTC m=+2208.395640690" Mar 18 16:12:46 crc kubenswrapper[4696]: I0318 16:12:46.345941 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" event={"ID":"a50d071c-8a54-4335-be8c-1842e52dcb81","Type":"ContainerStarted","Data":"759d9320d255f90581554f0ca43265f51439dddcbec5c9de181b8c0c6fbbe863"} Mar 18 16:12:46 crc kubenswrapper[4696]: I0318 16:12:46.346265 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" event={"ID":"a50d071c-8a54-4335-be8c-1842e52dcb81","Type":"ContainerStarted","Data":"cdbbcff0321370ef4e775b27b87a5d26c8bfa1f3c87c957bbd851dd346635bc4"} Mar 18 16:12:50 crc kubenswrapper[4696]: I0318 16:12:50.223781 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-92djs" Mar 18 16:12:50 crc kubenswrapper[4696]: I0318 16:12:50.225688 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-92djs" Mar 18 16:12:50 crc kubenswrapper[4696]: I0318 16:12:50.290700 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-92djs" Mar 18 16:12:50 crc kubenswrapper[4696]: I0318 16:12:50.314540 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" podStartSLOduration=5.991714814 podStartE2EDuration="6.314508066s" podCreationTimestamp="2026-03-18 16:12:44 +0000 UTC" firstStartedPulling="2026-03-18 16:12:45.37378777 +0000 UTC m=+2208.379961996" lastFinishedPulling="2026-03-18 16:12:45.696581042 +0000 UTC m=+2208.702755248" observedRunningTime="2026-03-18 16:12:46.370463915 +0000 UTC m=+2209.376638121" watchObservedRunningTime="2026-03-18 16:12:50.314508066 +0000 UTC m=+2213.320682272" Mar 18 16:12:50 crc kubenswrapper[4696]: I0318 
16:12:50.421438 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-92djs" Mar 18 16:12:50 crc kubenswrapper[4696]: I0318 16:12:50.532440 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-92djs"] Mar 18 16:12:52 crc kubenswrapper[4696]: I0318 16:12:52.394054 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-92djs" podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerName="registry-server" containerID="cri-o://a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd" gracePeriod=2 Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.011868 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-92djs" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.110730 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8gcx\" (UniqueName: \"kubernetes.io/projected/23de02e1-77ae-45d8-a49f-9bea005f18d4-kube-api-access-j8gcx\") pod \"23de02e1-77ae-45d8-a49f-9bea005f18d4\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.110828 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-utilities\") pod \"23de02e1-77ae-45d8-a49f-9bea005f18d4\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.110900 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-catalog-content\") pod \"23de02e1-77ae-45d8-a49f-9bea005f18d4\" (UID: \"23de02e1-77ae-45d8-a49f-9bea005f18d4\") " Mar 18 16:12:53 crc kubenswrapper[4696]: 
I0318 16:12:53.111809 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-utilities" (OuterVolumeSpecName: "utilities") pod "23de02e1-77ae-45d8-a49f-9bea005f18d4" (UID: "23de02e1-77ae-45d8-a49f-9bea005f18d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.118778 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23de02e1-77ae-45d8-a49f-9bea005f18d4-kube-api-access-j8gcx" (OuterVolumeSpecName: "kube-api-access-j8gcx") pod "23de02e1-77ae-45d8-a49f-9bea005f18d4" (UID: "23de02e1-77ae-45d8-a49f-9bea005f18d4"). InnerVolumeSpecName "kube-api-access-j8gcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.177748 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23de02e1-77ae-45d8-a49f-9bea005f18d4" (UID: "23de02e1-77ae-45d8-a49f-9bea005f18d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.214391 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8gcx\" (UniqueName: \"kubernetes.io/projected/23de02e1-77ae-45d8-a49f-9bea005f18d4-kube-api-access-j8gcx\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.214424 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.214434 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23de02e1-77ae-45d8-a49f-9bea005f18d4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.404212 4696 generic.go:334] "Generic (PLEG): container finished" podID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerID="a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd" exitCode=0 Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.404260 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92djs" event={"ID":"23de02e1-77ae-45d8-a49f-9bea005f18d4","Type":"ContainerDied","Data":"a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd"} Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.404283 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-92djs" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.404296 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92djs" event={"ID":"23de02e1-77ae-45d8-a49f-9bea005f18d4","Type":"ContainerDied","Data":"394243e219f3337ed7a64f53f5b697162b357ae4e6fd1dbaf180faa36e171930"} Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.404318 4696 scope.go:117] "RemoveContainer" containerID="a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.436617 4696 scope.go:117] "RemoveContainer" containerID="82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.453179 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-92djs"] Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.460849 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-92djs"] Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.466114 4696 scope.go:117] "RemoveContainer" containerID="608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.508656 4696 scope.go:117] "RemoveContainer" containerID="a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd" Mar 18 16:12:53 crc kubenswrapper[4696]: E0318 16:12:53.509162 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd\": container with ID starting with a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd not found: ID does not exist" containerID="a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.509213 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd"} err="failed to get container status \"a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd\": rpc error: code = NotFound desc = could not find container \"a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd\": container with ID starting with a6a2d77101d225b9ecb652006d9afd469eff871bbc2aefc77206ae696db31ffd not found: ID does not exist" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.509247 4696 scope.go:117] "RemoveContainer" containerID="82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da" Mar 18 16:12:53 crc kubenswrapper[4696]: E0318 16:12:53.509806 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da\": container with ID starting with 82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da not found: ID does not exist" containerID="82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.509862 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da"} err="failed to get container status \"82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da\": rpc error: code = NotFound desc = could not find container \"82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da\": container with ID starting with 82666120eab41a5c0b19d479cd61a8c537160a0ec0ae35f4fcd3adedb1dd49da not found: ID does not exist" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.509895 4696 scope.go:117] "RemoveContainer" containerID="608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d" Mar 18 16:12:53 crc kubenswrapper[4696]: E0318 
16:12:53.510265 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d\": container with ID starting with 608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d not found: ID does not exist" containerID="608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.510294 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d"} err="failed to get container status \"608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d\": rpc error: code = NotFound desc = could not find container \"608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d\": container with ID starting with 608d0a4e4410a6752d469c9b871c97d88afe6fa0ef92cebfe431a008894ef32d not found: ID does not exist" Mar 18 16:12:53 crc kubenswrapper[4696]: I0318 16:12:53.608330 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" path="/var/lib/kubelet/pods/23de02e1-77ae-45d8-a49f-9bea005f18d4/volumes" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.828274 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b5lhr"] Mar 18 16:13:03 crc kubenswrapper[4696]: E0318 16:13:03.829137 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerName="extract-utilities" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.829152 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerName="extract-utilities" Mar 18 16:13:03 crc kubenswrapper[4696]: E0318 16:13:03.829178 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerName="registry-server" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.829185 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerName="registry-server" Mar 18 16:13:03 crc kubenswrapper[4696]: E0318 16:13:03.829209 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerName="extract-content" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.829219 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerName="extract-content" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.829457 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="23de02e1-77ae-45d8-a49f-9bea005f18d4" containerName="registry-server" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.830953 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.848972 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5lhr"] Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.913767 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpkv\" (UniqueName: \"kubernetes.io/projected/bd29f121-6058-48a2-8468-99f6e38dc386-kube-api-access-2hpkv\") pod \"certified-operators-b5lhr\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.913843 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-catalog-content\") pod \"certified-operators-b5lhr\" (UID: 
\"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:03 crc kubenswrapper[4696]: I0318 16:13:03.914051 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-utilities\") pod \"certified-operators-b5lhr\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:04 crc kubenswrapper[4696]: I0318 16:13:04.015988 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpkv\" (UniqueName: \"kubernetes.io/projected/bd29f121-6058-48a2-8468-99f6e38dc386-kube-api-access-2hpkv\") pod \"certified-operators-b5lhr\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:04 crc kubenswrapper[4696]: I0318 16:13:04.016066 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-catalog-content\") pod \"certified-operators-b5lhr\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:04 crc kubenswrapper[4696]: I0318 16:13:04.016142 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-utilities\") pod \"certified-operators-b5lhr\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:04 crc kubenswrapper[4696]: I0318 16:13:04.016643 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-utilities\") pod \"certified-operators-b5lhr\" (UID: 
\"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:04 crc kubenswrapper[4696]: I0318 16:13:04.016699 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-catalog-content\") pod \"certified-operators-b5lhr\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:04 crc kubenswrapper[4696]: I0318 16:13:04.035196 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpkv\" (UniqueName: \"kubernetes.io/projected/bd29f121-6058-48a2-8468-99f6e38dc386-kube-api-access-2hpkv\") pod \"certified-operators-b5lhr\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:04 crc kubenswrapper[4696]: I0318 16:13:04.156723 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:04 crc kubenswrapper[4696]: I0318 16:13:04.636321 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5lhr"] Mar 18 16:13:05 crc kubenswrapper[4696]: I0318 16:13:05.515110 4696 generic.go:334] "Generic (PLEG): container finished" podID="bd29f121-6058-48a2-8468-99f6e38dc386" containerID="a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4" exitCode=0 Mar 18 16:13:05 crc kubenswrapper[4696]: I0318 16:13:05.515214 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lhr" event={"ID":"bd29f121-6058-48a2-8468-99f6e38dc386","Type":"ContainerDied","Data":"a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4"} Mar 18 16:13:05 crc kubenswrapper[4696]: I0318 16:13:05.515497 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lhr" event={"ID":"bd29f121-6058-48a2-8468-99f6e38dc386","Type":"ContainerStarted","Data":"625d0d87dc630b5a28c92f1ee1abdcf56327be9d02f382d22bc26acda10a5482"} Mar 18 16:13:07 crc kubenswrapper[4696]: I0318 16:13:07.533343 4696 generic.go:334] "Generic (PLEG): container finished" podID="bd29f121-6058-48a2-8468-99f6e38dc386" containerID="7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d" exitCode=0 Mar 18 16:13:07 crc kubenswrapper[4696]: I0318 16:13:07.533400 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lhr" event={"ID":"bd29f121-6058-48a2-8468-99f6e38dc386","Type":"ContainerDied","Data":"7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d"} Mar 18 16:13:10 crc kubenswrapper[4696]: I0318 16:13:10.576789 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lhr" 
event={"ID":"bd29f121-6058-48a2-8468-99f6e38dc386","Type":"ContainerStarted","Data":"1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2"} Mar 18 16:13:10 crc kubenswrapper[4696]: I0318 16:13:10.600824 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b5lhr" podStartSLOduration=3.343436654 podStartE2EDuration="7.600806579s" podCreationTimestamp="2026-03-18 16:13:03 +0000 UTC" firstStartedPulling="2026-03-18 16:13:05.516870208 +0000 UTC m=+2228.523044414" lastFinishedPulling="2026-03-18 16:13:09.774240123 +0000 UTC m=+2232.780414339" observedRunningTime="2026-03-18 16:13:10.592914081 +0000 UTC m=+2233.599088297" watchObservedRunningTime="2026-03-18 16:13:10.600806579 +0000 UTC m=+2233.606980785" Mar 18 16:13:14 crc kubenswrapper[4696]: I0318 16:13:14.157630 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:14 crc kubenswrapper[4696]: I0318 16:13:14.158261 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:14 crc kubenswrapper[4696]: I0318 16:13:14.206552 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:24 crc kubenswrapper[4696]: I0318 16:13:24.208849 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:24 crc kubenswrapper[4696]: I0318 16:13:24.260304 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5lhr"] Mar 18 16:13:24 crc kubenswrapper[4696]: I0318 16:13:24.704001 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b5lhr" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" containerName="registry-server" 
containerID="cri-o://1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2" gracePeriod=2 Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.137480 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.224579 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hpkv\" (UniqueName: \"kubernetes.io/projected/bd29f121-6058-48a2-8468-99f6e38dc386-kube-api-access-2hpkv\") pod \"bd29f121-6058-48a2-8468-99f6e38dc386\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.224665 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-catalog-content\") pod \"bd29f121-6058-48a2-8468-99f6e38dc386\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.224919 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-utilities\") pod \"bd29f121-6058-48a2-8468-99f6e38dc386\" (UID: \"bd29f121-6058-48a2-8468-99f6e38dc386\") " Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.225883 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-utilities" (OuterVolumeSpecName: "utilities") pod "bd29f121-6058-48a2-8468-99f6e38dc386" (UID: "bd29f121-6058-48a2-8468-99f6e38dc386"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.233426 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd29f121-6058-48a2-8468-99f6e38dc386-kube-api-access-2hpkv" (OuterVolumeSpecName: "kube-api-access-2hpkv") pod "bd29f121-6058-48a2-8468-99f6e38dc386" (UID: "bd29f121-6058-48a2-8468-99f6e38dc386"). InnerVolumeSpecName "kube-api-access-2hpkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.276155 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd29f121-6058-48a2-8468-99f6e38dc386" (UID: "bd29f121-6058-48a2-8468-99f6e38dc386"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.327322 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.327371 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hpkv\" (UniqueName: \"kubernetes.io/projected/bd29f121-6058-48a2-8468-99f6e38dc386-kube-api-access-2hpkv\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.327398 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd29f121-6058-48a2-8468-99f6e38dc386-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.713477 4696 generic.go:334] "Generic (PLEG): container finished" podID="bd29f121-6058-48a2-8468-99f6e38dc386" 
containerID="1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2" exitCode=0 Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.713542 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lhr" event={"ID":"bd29f121-6058-48a2-8468-99f6e38dc386","Type":"ContainerDied","Data":"1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2"} Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.713574 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lhr" event={"ID":"bd29f121-6058-48a2-8468-99f6e38dc386","Type":"ContainerDied","Data":"625d0d87dc630b5a28c92f1ee1abdcf56327be9d02f382d22bc26acda10a5482"} Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.713595 4696 scope.go:117] "RemoveContainer" containerID="1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.713745 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5lhr" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.738936 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5lhr"] Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.740283 4696 scope.go:117] "RemoveContainer" containerID="7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.746669 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b5lhr"] Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.763572 4696 scope.go:117] "RemoveContainer" containerID="a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.804236 4696 scope.go:117] "RemoveContainer" containerID="1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2" Mar 18 16:13:25 crc kubenswrapper[4696]: E0318 16:13:25.804779 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2\": container with ID starting with 1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2 not found: ID does not exist" containerID="1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.804826 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2"} err="failed to get container status \"1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2\": rpc error: code = NotFound desc = could not find container \"1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2\": container with ID starting with 1bc353d1587653633b5e44cbe0a494b36a40020e0b1d1dbf45a6401f6b97f5c2 not 
found: ID does not exist" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.804850 4696 scope.go:117] "RemoveContainer" containerID="7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d" Mar 18 16:13:25 crc kubenswrapper[4696]: E0318 16:13:25.805397 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d\": container with ID starting with 7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d not found: ID does not exist" containerID="7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.805423 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d"} err="failed to get container status \"7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d\": rpc error: code = NotFound desc = could not find container \"7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d\": container with ID starting with 7b1e000c7f6de678a23ac3c3392ce89767bedde70be461f8f4ba2431d35a617d not found: ID does not exist" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.805437 4696 scope.go:117] "RemoveContainer" containerID="a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4" Mar 18 16:13:25 crc kubenswrapper[4696]: E0318 16:13:25.805696 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4\": container with ID starting with a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4 not found: ID does not exist" containerID="a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4" Mar 18 16:13:25 crc kubenswrapper[4696]: I0318 16:13:25.805719 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4"} err="failed to get container status \"a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4\": rpc error: code = NotFound desc = could not find container \"a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4\": container with ID starting with a683b43e56180e17851947d754e6fb4fab869ad63654ed88a53a7c137d3f37f4 not found: ID does not exist" Mar 18 16:13:27 crc kubenswrapper[4696]: I0318 16:13:27.610666 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" path="/var/lib/kubelet/pods/bd29f121-6058-48a2-8468-99f6e38dc386/volumes" Mar 18 16:13:32 crc kubenswrapper[4696]: I0318 16:13:32.783461 4696 generic.go:334] "Generic (PLEG): container finished" podID="a50d071c-8a54-4335-be8c-1842e52dcb81" containerID="759d9320d255f90581554f0ca43265f51439dddcbec5c9de181b8c0c6fbbe863" exitCode=0 Mar 18 16:13:32 crc kubenswrapper[4696]: I0318 16:13:32.783546 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" event={"ID":"a50d071c-8a54-4335-be8c-1842e52dcb81","Type":"ContainerDied","Data":"759d9320d255f90581554f0ca43265f51439dddcbec5c9de181b8c0c6fbbe863"} Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.190176 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.300208 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7zc6\" (UniqueName: \"kubernetes.io/projected/a50d071c-8a54-4335-be8c-1842e52dcb81-kube-api-access-m7zc6\") pod \"a50d071c-8a54-4335-be8c-1842e52dcb81\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.300426 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-nova-metadata-neutron-config-0\") pod \"a50d071c-8a54-4335-be8c-1842e52dcb81\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.300638 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-inventory\") pod \"a50d071c-8a54-4335-be8c-1842e52dcb81\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.300745 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-ssh-key-openstack-edpm-ipam\") pod \"a50d071c-8a54-4335-be8c-1842e52dcb81\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.300764 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a50d071c-8a54-4335-be8c-1842e52dcb81\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " Mar 18 16:13:34 crc 
kubenswrapper[4696]: I0318 16:13:34.300852 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-metadata-combined-ca-bundle\") pod \"a50d071c-8a54-4335-be8c-1842e52dcb81\" (UID: \"a50d071c-8a54-4335-be8c-1842e52dcb81\") " Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.307414 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a50d071c-8a54-4335-be8c-1842e52dcb81-kube-api-access-m7zc6" (OuterVolumeSpecName: "kube-api-access-m7zc6") pod "a50d071c-8a54-4335-be8c-1842e52dcb81" (UID: "a50d071c-8a54-4335-be8c-1842e52dcb81"). InnerVolumeSpecName "kube-api-access-m7zc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.307580 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a50d071c-8a54-4335-be8c-1842e52dcb81" (UID: "a50d071c-8a54-4335-be8c-1842e52dcb81"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.331954 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a50d071c-8a54-4335-be8c-1842e52dcb81" (UID: "a50d071c-8a54-4335-be8c-1842e52dcb81"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.336613 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-inventory" (OuterVolumeSpecName: "inventory") pod "a50d071c-8a54-4335-be8c-1842e52dcb81" (UID: "a50d071c-8a54-4335-be8c-1842e52dcb81"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.337721 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a50d071c-8a54-4335-be8c-1842e52dcb81" (UID: "a50d071c-8a54-4335-be8c-1842e52dcb81"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.351380 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a50d071c-8a54-4335-be8c-1842e52dcb81" (UID: "a50d071c-8a54-4335-be8c-1842e52dcb81"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.402705 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.402740 4696 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.402754 4696 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.402764 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7zc6\" (UniqueName: \"kubernetes.io/projected/a50d071c-8a54-4335-be8c-1842e52dcb81-kube-api-access-m7zc6\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.402773 4696 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.402782 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50d071c-8a54-4335-be8c-1842e52dcb81-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.803410 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" event={"ID":"a50d071c-8a54-4335-be8c-1842e52dcb81","Type":"ContainerDied","Data":"cdbbcff0321370ef4e775b27b87a5d26c8bfa1f3c87c957bbd851dd346635bc4"} Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.803461 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdbbcff0321370ef4e775b27b87a5d26c8bfa1f3c87c957bbd851dd346635bc4" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.803421 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.926492 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p"] Mar 18 16:13:34 crc kubenswrapper[4696]: E0318 16:13:34.927046 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" containerName="registry-server" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.927070 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" containerName="registry-server" Mar 18 16:13:34 crc kubenswrapper[4696]: E0318 16:13:34.927094 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" containerName="extract-content" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.927103 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" containerName="extract-content" Mar 18 16:13:34 crc kubenswrapper[4696]: E0318 16:13:34.927126 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50d071c-8a54-4335-be8c-1842e52dcb81" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.927136 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a50d071c-8a54-4335-be8c-1842e52dcb81" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 16:13:34 crc kubenswrapper[4696]: E0318 16:13:34.927161 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" containerName="extract-utilities" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.927169 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" containerName="extract-utilities" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.927374 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a50d071c-8a54-4335-be8c-1842e52dcb81" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.927394 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd29f121-6058-48a2-8468-99f6e38dc386" containerName="registry-server" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.928201 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.933851 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.934951 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.935136 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.942892 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p"] Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.942939 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:13:34 crc kubenswrapper[4696]: I0318 16:13:34.944674 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.012963 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.013044 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.013074 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.013161 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-874pj\" (UniqueName: \"kubernetes.io/projected/ded21247-5107-45ab-9b12-25cb76cdfda3-kube-api-access-874pj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.013189 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.115402 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.115453 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.115471 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.115529 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-874pj\" (UniqueName: \"kubernetes.io/projected/ded21247-5107-45ab-9b12-25cb76cdfda3-kube-api-access-874pj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.115549 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.118730 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: 
\"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.119784 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.120166 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.132481 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.138934 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-874pj\" (UniqueName: \"kubernetes.io/projected/ded21247-5107-45ab-9b12-25cb76cdfda3-kube-api-access-874pj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.249853 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:13:35 crc kubenswrapper[4696]: I0318 16:13:35.826372 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p"] Mar 18 16:13:35 crc kubenswrapper[4696]: W0318 16:13:35.836465 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded21247_5107_45ab_9b12_25cb76cdfda3.slice/crio-13940cc5afa4cc58a89f3057027fe47edc6dd9f273b636e00935e2eecbf22a6f WatchSource:0}: Error finding container 13940cc5afa4cc58a89f3057027fe47edc6dd9f273b636e00935e2eecbf22a6f: Status 404 returned error can't find the container with id 13940cc5afa4cc58a89f3057027fe47edc6dd9f273b636e00935e2eecbf22a6f Mar 18 16:13:36 crc kubenswrapper[4696]: I0318 16:13:36.824351 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" event={"ID":"ded21247-5107-45ab-9b12-25cb76cdfda3","Type":"ContainerStarted","Data":"99be9e3f32bebf0c74f45e69d7319132f2068ec08bbe377b5ba5730dff551724"} Mar 18 16:13:36 crc kubenswrapper[4696]: I0318 16:13:36.826655 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" event={"ID":"ded21247-5107-45ab-9b12-25cb76cdfda3","Type":"ContainerStarted","Data":"13940cc5afa4cc58a89f3057027fe47edc6dd9f273b636e00935e2eecbf22a6f"} Mar 18 16:13:36 crc kubenswrapper[4696]: I0318 16:13:36.847065 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" podStartSLOduration=2.676895726 podStartE2EDuration="2.847034207s" podCreationTimestamp="2026-03-18 16:13:34 +0000 UTC" firstStartedPulling="2026-03-18 16:13:35.839712825 +0000 UTC m=+2258.845887041" lastFinishedPulling="2026-03-18 16:13:36.009851316 +0000 UTC m=+2259.016025522" 
observedRunningTime="2026-03-18 16:13:36.845209371 +0000 UTC m=+2259.851383577" watchObservedRunningTime="2026-03-18 16:13:36.847034207 +0000 UTC m=+2259.853208413" Mar 18 16:13:42 crc kubenswrapper[4696]: I0318 16:13:42.184930 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:13:42 crc kubenswrapper[4696]: I0318 16:13:42.185595 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.136035 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564174-jf5lv"] Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.138178 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-jf5lv" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.141628 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.141827 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.145888 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-jf5lv"] Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.148083 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.310350 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9sz\" (UniqueName: \"kubernetes.io/projected/a44086b8-ccf9-4372-8350-f37c793dcc13-kube-api-access-tw9sz\") pod \"auto-csr-approver-29564174-jf5lv\" (UID: \"a44086b8-ccf9-4372-8350-f37c793dcc13\") " pod="openshift-infra/auto-csr-approver-29564174-jf5lv" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.411910 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9sz\" (UniqueName: \"kubernetes.io/projected/a44086b8-ccf9-4372-8350-f37c793dcc13-kube-api-access-tw9sz\") pod \"auto-csr-approver-29564174-jf5lv\" (UID: \"a44086b8-ccf9-4372-8350-f37c793dcc13\") " pod="openshift-infra/auto-csr-approver-29564174-jf5lv" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.429823 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9sz\" (UniqueName: \"kubernetes.io/projected/a44086b8-ccf9-4372-8350-f37c793dcc13-kube-api-access-tw9sz\") pod \"auto-csr-approver-29564174-jf5lv\" (UID: \"a44086b8-ccf9-4372-8350-f37c793dcc13\") " 
pod="openshift-infra/auto-csr-approver-29564174-jf5lv" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.458266 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-jf5lv" Mar 18 16:14:00 crc kubenswrapper[4696]: I0318 16:14:00.907201 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-jf5lv"] Mar 18 16:14:00 crc kubenswrapper[4696]: W0318 16:14:00.908364 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44086b8_ccf9_4372_8350_f37c793dcc13.slice/crio-a73035d747bc6d09105c4cc274a189e36f657d256dcaf74617d40e89ff62dee1 WatchSource:0}: Error finding container a73035d747bc6d09105c4cc274a189e36f657d256dcaf74617d40e89ff62dee1: Status 404 returned error can't find the container with id a73035d747bc6d09105c4cc274a189e36f657d256dcaf74617d40e89ff62dee1 Mar 18 16:14:01 crc kubenswrapper[4696]: I0318 16:14:01.036278 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-jf5lv" event={"ID":"a44086b8-ccf9-4372-8350-f37c793dcc13","Type":"ContainerStarted","Data":"a73035d747bc6d09105c4cc274a189e36f657d256dcaf74617d40e89ff62dee1"} Mar 18 16:14:03 crc kubenswrapper[4696]: I0318 16:14:03.059059 4696 generic.go:334] "Generic (PLEG): container finished" podID="a44086b8-ccf9-4372-8350-f37c793dcc13" containerID="4d495557875b3aa4b741621151db02aa7f44308c38106614a57a9c60c788c29d" exitCode=0 Mar 18 16:14:03 crc kubenswrapper[4696]: I0318 16:14:03.059158 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-jf5lv" event={"ID":"a44086b8-ccf9-4372-8350-f37c793dcc13","Type":"ContainerDied","Data":"4d495557875b3aa4b741621151db02aa7f44308c38106614a57a9c60c788c29d"} Mar 18 16:14:04 crc kubenswrapper[4696]: I0318 16:14:04.401976 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-jf5lv" Mar 18 16:14:04 crc kubenswrapper[4696]: I0318 16:14:04.594715 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw9sz\" (UniqueName: \"kubernetes.io/projected/a44086b8-ccf9-4372-8350-f37c793dcc13-kube-api-access-tw9sz\") pod \"a44086b8-ccf9-4372-8350-f37c793dcc13\" (UID: \"a44086b8-ccf9-4372-8350-f37c793dcc13\") " Mar 18 16:14:04 crc kubenswrapper[4696]: I0318 16:14:04.600789 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44086b8-ccf9-4372-8350-f37c793dcc13-kube-api-access-tw9sz" (OuterVolumeSpecName: "kube-api-access-tw9sz") pod "a44086b8-ccf9-4372-8350-f37c793dcc13" (UID: "a44086b8-ccf9-4372-8350-f37c793dcc13"). InnerVolumeSpecName "kube-api-access-tw9sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:14:04 crc kubenswrapper[4696]: I0318 16:14:04.698613 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw9sz\" (UniqueName: \"kubernetes.io/projected/a44086b8-ccf9-4372-8350-f37c793dcc13-kube-api-access-tw9sz\") on node \"crc\" DevicePath \"\"" Mar 18 16:14:05 crc kubenswrapper[4696]: I0318 16:14:05.078681 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-jf5lv" event={"ID":"a44086b8-ccf9-4372-8350-f37c793dcc13","Type":"ContainerDied","Data":"a73035d747bc6d09105c4cc274a189e36f657d256dcaf74617d40e89ff62dee1"} Mar 18 16:14:05 crc kubenswrapper[4696]: I0318 16:14:05.078730 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a73035d747bc6d09105c4cc274a189e36f657d256dcaf74617d40e89ff62dee1" Mar 18 16:14:05 crc kubenswrapper[4696]: I0318 16:14:05.078759 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-jf5lv" Mar 18 16:14:05 crc kubenswrapper[4696]: I0318 16:14:05.477563 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-46r9b"] Mar 18 16:14:05 crc kubenswrapper[4696]: I0318 16:14:05.485916 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-46r9b"] Mar 18 16:14:05 crc kubenswrapper[4696]: I0318 16:14:05.609297 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f401c295-b79a-4193-a1a7-f40d9b42a96a" path="/var/lib/kubelet/pods/f401c295-b79a-4193-a1a7-f40d9b42a96a/volumes" Mar 18 16:14:12 crc kubenswrapper[4696]: I0318 16:14:12.185026 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:14:12 crc kubenswrapper[4696]: I0318 16:14:12.185518 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:14:22 crc kubenswrapper[4696]: I0318 16:14:22.964804 4696 scope.go:117] "RemoveContainer" containerID="686791b4014df2b99fd1a3edc3c6f046c87e536e7aa1d1f1d3f5e057a1771481" Mar 18 16:14:42 crc kubenswrapper[4696]: I0318 16:14:42.184044 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:14:42 crc kubenswrapper[4696]: 
I0318 16:14:42.184619 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:14:42 crc kubenswrapper[4696]: I0318 16:14:42.184664 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 16:14:42 crc kubenswrapper[4696]: I0318 16:14:42.185296 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:14:42 crc kubenswrapper[4696]: I0318 16:14:42.185339 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" gracePeriod=600 Mar 18 16:14:42 crc kubenswrapper[4696]: E0318 16:14:42.308190 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:14:42 crc kubenswrapper[4696]: I0318 16:14:42.394509 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" exitCode=0 Mar 18 16:14:42 crc kubenswrapper[4696]: I0318 16:14:42.394572 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f"} Mar 18 16:14:42 crc kubenswrapper[4696]: I0318 16:14:42.394825 4696 scope.go:117] "RemoveContainer" containerID="418d22c717d701d8c2458f1bdfe29acc3f5066a0a4b0d71d5979be21ee6f29d7" Mar 18 16:14:42 crc kubenswrapper[4696]: I0318 16:14:42.395491 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:14:42 crc kubenswrapper[4696]: E0318 16:14:42.395753 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:14:53 crc kubenswrapper[4696]: I0318 16:14:53.598086 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:14:53 crc kubenswrapper[4696]: E0318 16:14:53.598910 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 
16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.145593 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m"] Mar 18 16:15:00 crc kubenswrapper[4696]: E0318 16:15:00.146692 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44086b8-ccf9-4372-8350-f37c793dcc13" containerName="oc" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.146711 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44086b8-ccf9-4372-8350-f37c793dcc13" containerName="oc" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.146984 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44086b8-ccf9-4372-8350-f37c793dcc13" containerName="oc" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.147742 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.150406 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.150659 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.158840 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m"] Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.161991 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvzsm\" (UniqueName: \"kubernetes.io/projected/c569b02b-8d94-413a-8beb-acc7091935cc-kube-api-access-qvzsm\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.162154 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c569b02b-8d94-413a-8beb-acc7091935cc-config-volume\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.162197 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c569b02b-8d94-413a-8beb-acc7091935cc-secret-volume\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.264900 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvzsm\" (UniqueName: \"kubernetes.io/projected/c569b02b-8d94-413a-8beb-acc7091935cc-kube-api-access-qvzsm\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.265042 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c569b02b-8d94-413a-8beb-acc7091935cc-config-volume\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.265092 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c569b02b-8d94-413a-8beb-acc7091935cc-secret-volume\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.266086 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c569b02b-8d94-413a-8beb-acc7091935cc-config-volume\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.272554 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c569b02b-8d94-413a-8beb-acc7091935cc-secret-volume\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.286086 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvzsm\" (UniqueName: \"kubernetes.io/projected/c569b02b-8d94-413a-8beb-acc7091935cc-kube-api-access-qvzsm\") pod \"collect-profiles-29564175-pzs7m\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.469613 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:00 crc kubenswrapper[4696]: I0318 16:15:00.946853 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m"] Mar 18 16:15:01 crc kubenswrapper[4696]: I0318 16:15:01.595920 4696 generic.go:334] "Generic (PLEG): container finished" podID="c569b02b-8d94-413a-8beb-acc7091935cc" containerID="24e85081aa2d8764a3ee85000fa31d883182c5e9f6d1e1040e6adad913485433" exitCode=0 Mar 18 16:15:01 crc kubenswrapper[4696]: I0318 16:15:01.596016 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" event={"ID":"c569b02b-8d94-413a-8beb-acc7091935cc","Type":"ContainerDied","Data":"24e85081aa2d8764a3ee85000fa31d883182c5e9f6d1e1040e6adad913485433"} Mar 18 16:15:01 crc kubenswrapper[4696]: I0318 16:15:01.596206 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" event={"ID":"c569b02b-8d94-413a-8beb-acc7091935cc","Type":"ContainerStarted","Data":"d3a49b811eeb65572a240ef5e30c070f489c31921f554bb39ef521be99d71b28"} Mar 18 16:15:02 crc kubenswrapper[4696]: I0318 16:15:02.923437 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.023621 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvzsm\" (UniqueName: \"kubernetes.io/projected/c569b02b-8d94-413a-8beb-acc7091935cc-kube-api-access-qvzsm\") pod \"c569b02b-8d94-413a-8beb-acc7091935cc\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.023665 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c569b02b-8d94-413a-8beb-acc7091935cc-secret-volume\") pod \"c569b02b-8d94-413a-8beb-acc7091935cc\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.023785 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c569b02b-8d94-413a-8beb-acc7091935cc-config-volume\") pod \"c569b02b-8d94-413a-8beb-acc7091935cc\" (UID: \"c569b02b-8d94-413a-8beb-acc7091935cc\") " Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.024647 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c569b02b-8d94-413a-8beb-acc7091935cc-config-volume" (OuterVolumeSpecName: "config-volume") pod "c569b02b-8d94-413a-8beb-acc7091935cc" (UID: "c569b02b-8d94-413a-8beb-acc7091935cc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.025010 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c569b02b-8d94-413a-8beb-acc7091935cc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.031068 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c569b02b-8d94-413a-8beb-acc7091935cc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c569b02b-8d94-413a-8beb-acc7091935cc" (UID: "c569b02b-8d94-413a-8beb-acc7091935cc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.031081 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c569b02b-8d94-413a-8beb-acc7091935cc-kube-api-access-qvzsm" (OuterVolumeSpecName: "kube-api-access-qvzsm") pod "c569b02b-8d94-413a-8beb-acc7091935cc" (UID: "c569b02b-8d94-413a-8beb-acc7091935cc"). InnerVolumeSpecName "kube-api-access-qvzsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.127325 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvzsm\" (UniqueName: \"kubernetes.io/projected/c569b02b-8d94-413a-8beb-acc7091935cc-kube-api-access-qvzsm\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.127700 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c569b02b-8d94-413a-8beb-acc7091935cc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.614324 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" event={"ID":"c569b02b-8d94-413a-8beb-acc7091935cc","Type":"ContainerDied","Data":"d3a49b811eeb65572a240ef5e30c070f489c31921f554bb39ef521be99d71b28"} Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.614645 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3a49b811eeb65572a240ef5e30c070f489c31921f554bb39ef521be99d71b28" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.614355 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-pzs7m" Mar 18 16:15:03 crc kubenswrapper[4696]: I0318 16:15:03.994665 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6"] Mar 18 16:15:04 crc kubenswrapper[4696]: I0318 16:15:04.003066 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-4tdt6"] Mar 18 16:15:05 crc kubenswrapper[4696]: I0318 16:15:05.610350 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8339464b-c883-44d7-95eb-57c32689e91b" path="/var/lib/kubelet/pods/8339464b-c883-44d7-95eb-57c32689e91b/volumes" Mar 18 16:15:06 crc kubenswrapper[4696]: I0318 16:15:06.597286 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:15:06 crc kubenswrapper[4696]: E0318 16:15:06.597706 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:15:18 crc kubenswrapper[4696]: I0318 16:15:18.597656 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:15:18 crc kubenswrapper[4696]: E0318 16:15:18.598470 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:15:23 crc kubenswrapper[4696]: I0318 16:15:23.065420 4696 scope.go:117] "RemoveContainer" containerID="e3f8d242f60c1385fcd94796dad9cc2ca4dab05bff3bd93122f0a873f540d475" Mar 18 16:15:31 crc kubenswrapper[4696]: I0318 16:15:31.597361 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:15:31 crc kubenswrapper[4696]: E0318 16:15:31.598451 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:15:42 crc kubenswrapper[4696]: I0318 16:15:42.597114 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:15:42 crc kubenswrapper[4696]: E0318 16:15:42.597822 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:15:55 crc kubenswrapper[4696]: I0318 16:15:55.597612 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:15:55 crc kubenswrapper[4696]: E0318 16:15:55.598349 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.139692 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564176-mrsqq"] Mar 18 16:16:00 crc kubenswrapper[4696]: E0318 16:16:00.140787 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c569b02b-8d94-413a-8beb-acc7091935cc" containerName="collect-profiles" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.140807 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c569b02b-8d94-413a-8beb-acc7091935cc" containerName="collect-profiles" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.141131 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c569b02b-8d94-413a-8beb-acc7091935cc" containerName="collect-profiles" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.141939 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-mrsqq" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.146011 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.146041 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.146168 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.150383 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-mrsqq"] Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.296968 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5gmf\" (UniqueName: \"kubernetes.io/projected/10ff9679-d93d-4d70-b458-12b21f8097d4-kube-api-access-t5gmf\") pod \"auto-csr-approver-29564176-mrsqq\" (UID: \"10ff9679-d93d-4d70-b458-12b21f8097d4\") " pod="openshift-infra/auto-csr-approver-29564176-mrsqq" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.398583 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5gmf\" (UniqueName: \"kubernetes.io/projected/10ff9679-d93d-4d70-b458-12b21f8097d4-kube-api-access-t5gmf\") pod \"auto-csr-approver-29564176-mrsqq\" (UID: \"10ff9679-d93d-4d70-b458-12b21f8097d4\") " pod="openshift-infra/auto-csr-approver-29564176-mrsqq" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.417227 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5gmf\" (UniqueName: \"kubernetes.io/projected/10ff9679-d93d-4d70-b458-12b21f8097d4-kube-api-access-t5gmf\") pod \"auto-csr-approver-29564176-mrsqq\" (UID: \"10ff9679-d93d-4d70-b458-12b21f8097d4\") " 
pod="openshift-infra/auto-csr-approver-29564176-mrsqq" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.461789 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-mrsqq" Mar 18 16:16:00 crc kubenswrapper[4696]: I0318 16:16:00.922705 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-mrsqq"] Mar 18 16:16:01 crc kubenswrapper[4696]: I0318 16:16:01.165254 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-mrsqq" event={"ID":"10ff9679-d93d-4d70-b458-12b21f8097d4","Type":"ContainerStarted","Data":"3ec6f1bfcba6653fabf09be49e0fa6ca4be976fbc87145223065ca48e190df8d"} Mar 18 16:16:03 crc kubenswrapper[4696]: I0318 16:16:03.182566 4696 generic.go:334] "Generic (PLEG): container finished" podID="10ff9679-d93d-4d70-b458-12b21f8097d4" containerID="f0a5725eeec6b69afec35477fda15f34b84ec462ed65695886a51560cf5251ca" exitCode=0 Mar 18 16:16:03 crc kubenswrapper[4696]: I0318 16:16:03.182640 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-mrsqq" event={"ID":"10ff9679-d93d-4d70-b458-12b21f8097d4","Type":"ContainerDied","Data":"f0a5725eeec6b69afec35477fda15f34b84ec462ed65695886a51560cf5251ca"} Mar 18 16:16:04 crc kubenswrapper[4696]: I0318 16:16:04.503372 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-mrsqq" Mar 18 16:16:04 crc kubenswrapper[4696]: I0318 16:16:04.693895 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5gmf\" (UniqueName: \"kubernetes.io/projected/10ff9679-d93d-4d70-b458-12b21f8097d4-kube-api-access-t5gmf\") pod \"10ff9679-d93d-4d70-b458-12b21f8097d4\" (UID: \"10ff9679-d93d-4d70-b458-12b21f8097d4\") " Mar 18 16:16:04 crc kubenswrapper[4696]: I0318 16:16:04.702788 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ff9679-d93d-4d70-b458-12b21f8097d4-kube-api-access-t5gmf" (OuterVolumeSpecName: "kube-api-access-t5gmf") pod "10ff9679-d93d-4d70-b458-12b21f8097d4" (UID: "10ff9679-d93d-4d70-b458-12b21f8097d4"). InnerVolumeSpecName "kube-api-access-t5gmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:16:04 crc kubenswrapper[4696]: I0318 16:16:04.796339 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5gmf\" (UniqueName: \"kubernetes.io/projected/10ff9679-d93d-4d70-b458-12b21f8097d4-kube-api-access-t5gmf\") on node \"crc\" DevicePath \"\"" Mar 18 16:16:05 crc kubenswrapper[4696]: I0318 16:16:05.210986 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-mrsqq" event={"ID":"10ff9679-d93d-4d70-b458-12b21f8097d4","Type":"ContainerDied","Data":"3ec6f1bfcba6653fabf09be49e0fa6ca4be976fbc87145223065ca48e190df8d"} Mar 18 16:16:05 crc kubenswrapper[4696]: I0318 16:16:05.211038 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-mrsqq" Mar 18 16:16:05 crc kubenswrapper[4696]: I0318 16:16:05.211052 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ec6f1bfcba6653fabf09be49e0fa6ca4be976fbc87145223065ca48e190df8d" Mar 18 16:16:05 crc kubenswrapper[4696]: I0318 16:16:05.580065 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-wgdnn"] Mar 18 16:16:05 crc kubenswrapper[4696]: I0318 16:16:05.590602 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-wgdnn"] Mar 18 16:16:05 crc kubenswrapper[4696]: I0318 16:16:05.610240 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235ffb53-2fcd-4d61-8f87-f5a687615356" path="/var/lib/kubelet/pods/235ffb53-2fcd-4d61-8f87-f5a687615356/volumes" Mar 18 16:16:08 crc kubenswrapper[4696]: I0318 16:16:08.596838 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:16:08 crc kubenswrapper[4696]: E0318 16:16:08.597310 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:16:20 crc kubenswrapper[4696]: I0318 16:16:20.598596 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:16:20 crc kubenswrapper[4696]: E0318 16:16:20.599940 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:16:23 crc kubenswrapper[4696]: I0318 16:16:23.136366 4696 scope.go:117] "RemoveContainer" containerID="f4eaf0eca27fa84a780faeac7fbacda4d2b69d486c5a106b9236c5940c2c0ce2" Mar 18 16:16:31 crc kubenswrapper[4696]: I0318 16:16:31.597990 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:16:31 crc kubenswrapper[4696]: E0318 16:16:31.598762 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:16:44 crc kubenswrapper[4696]: I0318 16:16:44.598388 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:16:44 crc kubenswrapper[4696]: E0318 16:16:44.599347 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:16:59 crc kubenswrapper[4696]: I0318 16:16:59.598634 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:16:59 crc kubenswrapper[4696]: 
E0318 16:16:59.599398 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:17:12 crc kubenswrapper[4696]: I0318 16:17:12.597348 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:17:12 crc kubenswrapper[4696]: E0318 16:17:12.598275 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:17:20 crc kubenswrapper[4696]: I0318 16:17:20.109409 4696 generic.go:334] "Generic (PLEG): container finished" podID="ded21247-5107-45ab-9b12-25cb76cdfda3" containerID="99be9e3f32bebf0c74f45e69d7319132f2068ec08bbe377b5ba5730dff551724" exitCode=0 Mar 18 16:17:20 crc kubenswrapper[4696]: I0318 16:17:20.109642 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" event={"ID":"ded21247-5107-45ab-9b12-25cb76cdfda3","Type":"ContainerDied","Data":"99be9e3f32bebf0c74f45e69d7319132f2068ec08bbe377b5ba5730dff551724"} Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.516305 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.549803 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-inventory\") pod \"ded21247-5107-45ab-9b12-25cb76cdfda3\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.549886 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-combined-ca-bundle\") pod \"ded21247-5107-45ab-9b12-25cb76cdfda3\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.549911 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-secret-0\") pod \"ded21247-5107-45ab-9b12-25cb76cdfda3\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.550074 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-874pj\" (UniqueName: \"kubernetes.io/projected/ded21247-5107-45ab-9b12-25cb76cdfda3-kube-api-access-874pj\") pod \"ded21247-5107-45ab-9b12-25cb76cdfda3\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.550117 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-ssh-key-openstack-edpm-ipam\") pod \"ded21247-5107-45ab-9b12-25cb76cdfda3\" (UID: \"ded21247-5107-45ab-9b12-25cb76cdfda3\") " Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.555408 4696 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ded21247-5107-45ab-9b12-25cb76cdfda3" (UID: "ded21247-5107-45ab-9b12-25cb76cdfda3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.556228 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded21247-5107-45ab-9b12-25cb76cdfda3-kube-api-access-874pj" (OuterVolumeSpecName: "kube-api-access-874pj") pod "ded21247-5107-45ab-9b12-25cb76cdfda3" (UID: "ded21247-5107-45ab-9b12-25cb76cdfda3"). InnerVolumeSpecName "kube-api-access-874pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.579900 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ded21247-5107-45ab-9b12-25cb76cdfda3" (UID: "ded21247-5107-45ab-9b12-25cb76cdfda3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.580370 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-inventory" (OuterVolumeSpecName: "inventory") pod "ded21247-5107-45ab-9b12-25cb76cdfda3" (UID: "ded21247-5107-45ab-9b12-25cb76cdfda3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.582220 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "ded21247-5107-45ab-9b12-25cb76cdfda3" (UID: "ded21247-5107-45ab-9b12-25cb76cdfda3"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.652762 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-874pj\" (UniqueName: \"kubernetes.io/projected/ded21247-5107-45ab-9b12-25cb76cdfda3-kube-api-access-874pj\") on node \"crc\" DevicePath \"\"" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.652798 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.652810 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.652819 4696 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:17:21 crc kubenswrapper[4696]: I0318 16:17:21.652829 4696 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ded21247-5107-45ab-9b12-25cb76cdfda3-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.126295 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" event={"ID":"ded21247-5107-45ab-9b12-25cb76cdfda3","Type":"ContainerDied","Data":"13940cc5afa4cc58a89f3057027fe47edc6dd9f273b636e00935e2eecbf22a6f"} Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.126349 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13940cc5afa4cc58a89f3057027fe47edc6dd9f273b636e00935e2eecbf22a6f" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.126323 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.218690 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh"] Mar 18 16:17:22 crc kubenswrapper[4696]: E0318 16:17:22.219029 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded21247-5107-45ab-9b12-25cb76cdfda3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.219046 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded21247-5107-45ab-9b12-25cb76cdfda3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 16:17:22 crc kubenswrapper[4696]: E0318 16:17:22.219088 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ff9679-d93d-4d70-b458-12b21f8097d4" containerName="oc" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.219095 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ff9679-d93d-4d70-b458-12b21f8097d4" containerName="oc" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.219254 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ff9679-d93d-4d70-b458-12b21f8097d4" containerName="oc" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.219281 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ded21247-5107-45ab-9b12-25cb76cdfda3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.219910 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.222123 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.222240 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.225646 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.225684 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.225724 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.225901 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.228912 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh"] Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.233492 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.386077 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.386275 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.386704 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.386780 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.386906 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.386978 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57f3ea1b-d23e-435c-826f-539c401753be-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.387047 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.387191 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x446k\" (UniqueName: \"kubernetes.io/projected/57f3ea1b-d23e-435c-826f-539c401753be-kube-api-access-x446k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.387256 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.387312 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.387340 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.488691 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57f3ea1b-d23e-435c-826f-539c401753be-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.488776 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.488839 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x446k\" (UniqueName: \"kubernetes.io/projected/57f3ea1b-d23e-435c-826f-539c401753be-kube-api-access-x446k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.488876 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.488911 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.488939 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.488980 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.489019 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.489065 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.489097 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.489151 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.489610 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57f3ea1b-d23e-435c-826f-539c401753be-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.495678 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.495726 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.496216 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.497378 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.497382 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.498728 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.498732 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.504935 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.510691 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.521619 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x446k\" (UniqueName: \"kubernetes.io/projected/57f3ea1b-d23e-435c-826f-539c401753be-kube-api-access-x446k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4dhh\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:22 crc kubenswrapper[4696]: I0318 16:17:22.536494 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:17:23 crc kubenswrapper[4696]: I0318 16:17:23.038512 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh"] Mar 18 16:17:23 crc kubenswrapper[4696]: I0318 16:17:23.051498 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:17:23 crc kubenswrapper[4696]: I0318 16:17:23.137701 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" event={"ID":"57f3ea1b-d23e-435c-826f-539c401753be","Type":"ContainerStarted","Data":"d857ec9980d3cdd528a4865836d0a6627f27cf875f06506c783416e73f0404ae"} Mar 18 16:17:24 crc kubenswrapper[4696]: I0318 16:17:24.148472 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" event={"ID":"57f3ea1b-d23e-435c-826f-539c401753be","Type":"ContainerStarted","Data":"7e44ea926e094c951ecd68f8e1699b60c9138cf2ebbc4016de60891a0fe3c614"} Mar 18 16:17:24 crc kubenswrapper[4696]: I0318 16:17:24.187464 4696 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" podStartSLOduration=2.029419738 podStartE2EDuration="2.187441399s" podCreationTimestamp="2026-03-18 16:17:22 +0000 UTC" firstStartedPulling="2026-03-18 16:17:23.051168899 +0000 UTC m=+2486.057343105" lastFinishedPulling="2026-03-18 16:17:23.20919056 +0000 UTC m=+2486.215364766" observedRunningTime="2026-03-18 16:17:24.172696188 +0000 UTC m=+2487.178870394" watchObservedRunningTime="2026-03-18 16:17:24.187441399 +0000 UTC m=+2487.193615625" Mar 18 16:17:25 crc kubenswrapper[4696]: I0318 16:17:25.598456 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:17:25 crc kubenswrapper[4696]: E0318 16:17:25.598969 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:17:40 crc kubenswrapper[4696]: I0318 16:17:40.598394 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:17:40 crc kubenswrapper[4696]: E0318 16:17:40.599352 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:17:55 crc kubenswrapper[4696]: I0318 16:17:55.598084 4696 
scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:17:55 crc kubenswrapper[4696]: E0318 16:17:55.599047 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.150297 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564178-jsvx7"] Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.152014 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-jsvx7" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.154037 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.154489 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.155050 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.158551 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-jsvx7"] Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.266615 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s66g\" (UniqueName: \"kubernetes.io/projected/9f4d5b69-5589-4f08-a153-7e0ae2644a8b-kube-api-access-5s66g\") pod 
\"auto-csr-approver-29564178-jsvx7\" (UID: \"9f4d5b69-5589-4f08-a153-7e0ae2644a8b\") " pod="openshift-infra/auto-csr-approver-29564178-jsvx7" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.368434 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s66g\" (UniqueName: \"kubernetes.io/projected/9f4d5b69-5589-4f08-a153-7e0ae2644a8b-kube-api-access-5s66g\") pod \"auto-csr-approver-29564178-jsvx7\" (UID: \"9f4d5b69-5589-4f08-a153-7e0ae2644a8b\") " pod="openshift-infra/auto-csr-approver-29564178-jsvx7" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.389553 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s66g\" (UniqueName: \"kubernetes.io/projected/9f4d5b69-5589-4f08-a153-7e0ae2644a8b-kube-api-access-5s66g\") pod \"auto-csr-approver-29564178-jsvx7\" (UID: \"9f4d5b69-5589-4f08-a153-7e0ae2644a8b\") " pod="openshift-infra/auto-csr-approver-29564178-jsvx7" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.476156 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-jsvx7" Mar 18 16:18:00 crc kubenswrapper[4696]: I0318 16:18:00.912621 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-jsvx7"] Mar 18 16:18:00 crc kubenswrapper[4696]: W0318 16:18:00.916968 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f4d5b69_5589_4f08_a153_7e0ae2644a8b.slice/crio-a3269702a99b655693fb147f039677b10631a9e8bf36c9ef79fb2a80e6659bf1 WatchSource:0}: Error finding container a3269702a99b655693fb147f039677b10631a9e8bf36c9ef79fb2a80e6659bf1: Status 404 returned error can't find the container with id a3269702a99b655693fb147f039677b10631a9e8bf36c9ef79fb2a80e6659bf1 Mar 18 16:18:01 crc kubenswrapper[4696]: I0318 16:18:01.448009 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-jsvx7" event={"ID":"9f4d5b69-5589-4f08-a153-7e0ae2644a8b","Type":"ContainerStarted","Data":"a3269702a99b655693fb147f039677b10631a9e8bf36c9ef79fb2a80e6659bf1"} Mar 18 16:18:02 crc kubenswrapper[4696]: I0318 16:18:02.463403 4696 generic.go:334] "Generic (PLEG): container finished" podID="9f4d5b69-5589-4f08-a153-7e0ae2644a8b" containerID="6f94622a5a371de6fb44807053cce9daa30dafc1c62e0c05bb1150579fc27f44" exitCode=0 Mar 18 16:18:02 crc kubenswrapper[4696]: I0318 16:18:02.463560 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-jsvx7" event={"ID":"9f4d5b69-5589-4f08-a153-7e0ae2644a8b","Type":"ContainerDied","Data":"6f94622a5a371de6fb44807053cce9daa30dafc1c62e0c05bb1150579fc27f44"} Mar 18 16:18:03 crc kubenswrapper[4696]: I0318 16:18:03.807980 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-jsvx7" Mar 18 16:18:03 crc kubenswrapper[4696]: I0318 16:18:03.942709 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s66g\" (UniqueName: \"kubernetes.io/projected/9f4d5b69-5589-4f08-a153-7e0ae2644a8b-kube-api-access-5s66g\") pod \"9f4d5b69-5589-4f08-a153-7e0ae2644a8b\" (UID: \"9f4d5b69-5589-4f08-a153-7e0ae2644a8b\") " Mar 18 16:18:03 crc kubenswrapper[4696]: I0318 16:18:03.950855 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4d5b69-5589-4f08-a153-7e0ae2644a8b-kube-api-access-5s66g" (OuterVolumeSpecName: "kube-api-access-5s66g") pod "9f4d5b69-5589-4f08-a153-7e0ae2644a8b" (UID: "9f4d5b69-5589-4f08-a153-7e0ae2644a8b"). InnerVolumeSpecName "kube-api-access-5s66g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:18:04 crc kubenswrapper[4696]: I0318 16:18:04.045456 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s66g\" (UniqueName: \"kubernetes.io/projected/9f4d5b69-5589-4f08-a153-7e0ae2644a8b-kube-api-access-5s66g\") on node \"crc\" DevicePath \"\"" Mar 18 16:18:04 crc kubenswrapper[4696]: I0318 16:18:04.487472 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-jsvx7" event={"ID":"9f4d5b69-5589-4f08-a153-7e0ae2644a8b","Type":"ContainerDied","Data":"a3269702a99b655693fb147f039677b10631a9e8bf36c9ef79fb2a80e6659bf1"} Mar 18 16:18:04 crc kubenswrapper[4696]: I0318 16:18:04.487762 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3269702a99b655693fb147f039677b10631a9e8bf36c9ef79fb2a80e6659bf1" Mar 18 16:18:04 crc kubenswrapper[4696]: I0318 16:18:04.487541 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-jsvx7" Mar 18 16:18:04 crc kubenswrapper[4696]: I0318 16:18:04.881874 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-w46vq"] Mar 18 16:18:04 crc kubenswrapper[4696]: I0318 16:18:04.892163 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-w46vq"] Mar 18 16:18:05 crc kubenswrapper[4696]: I0318 16:18:05.613037 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec181c54-af7d-4bda-aace-bac85ff76032" path="/var/lib/kubelet/pods/ec181c54-af7d-4bda-aace-bac85ff76032/volumes" Mar 18 16:18:10 crc kubenswrapper[4696]: I0318 16:18:10.597965 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:18:10 crc kubenswrapper[4696]: E0318 16:18:10.598590 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:18:23 crc kubenswrapper[4696]: I0318 16:18:23.255031 4696 scope.go:117] "RemoveContainer" containerID="e4cc71585d860b3d30e89080d0040f9fe626cb91f87f9e375272bcfd3c400bad" Mar 18 16:18:25 crc kubenswrapper[4696]: I0318 16:18:25.597252 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:18:25 crc kubenswrapper[4696]: E0318 16:18:25.597817 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:18:39 crc kubenswrapper[4696]: I0318 16:18:39.597620 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:18:39 crc kubenswrapper[4696]: E0318 16:18:39.598434 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:18:53 crc kubenswrapper[4696]: I0318 16:18:53.598715 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:18:53 crc kubenswrapper[4696]: E0318 16:18:53.600867 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:19:04 crc kubenswrapper[4696]: I0318 16:19:04.597820 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:19:04 crc kubenswrapper[4696]: E0318 16:19:04.598665 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:19:15 crc kubenswrapper[4696]: I0318 16:19:15.696595 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:19:15 crc kubenswrapper[4696]: E0318 16:19:15.698714 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:19:29 crc kubenswrapper[4696]: I0318 16:19:29.598311 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:19:29 crc kubenswrapper[4696]: E0318 16:19:29.599774 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:19:42 crc kubenswrapper[4696]: I0318 16:19:42.789631 4696 generic.go:334] "Generic (PLEG): container finished" podID="57f3ea1b-d23e-435c-826f-539c401753be" containerID="7e44ea926e094c951ecd68f8e1699b60c9138cf2ebbc4016de60891a0fe3c614" exitCode=0 Mar 18 16:19:42 crc kubenswrapper[4696]: I0318 16:19:42.789713 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" event={"ID":"57f3ea1b-d23e-435c-826f-539c401753be","Type":"ContainerDied","Data":"7e44ea926e094c951ecd68f8e1699b60c9138cf2ebbc4016de60891a0fe3c614"} Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.195467 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.290859 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-ssh-key-openstack-edpm-ipam\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.290930 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x446k\" (UniqueName: \"kubernetes.io/projected/57f3ea1b-d23e-435c-826f-539c401753be-kube-api-access-x446k\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291011 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-2\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291096 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-1\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291127 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-inventory\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291175 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-0\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291207 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-3\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291289 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-1\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291365 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57f3ea1b-d23e-435c-826f-539c401753be-nova-extra-config-0\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291385 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-0\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.291456 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-combined-ca-bundle\") pod \"57f3ea1b-d23e-435c-826f-539c401753be\" (UID: \"57f3ea1b-d23e-435c-826f-539c401753be\") " Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.296601 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f3ea1b-d23e-435c-826f-539c401753be-kube-api-access-x446k" (OuterVolumeSpecName: "kube-api-access-x446k") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "kube-api-access-x446k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.297826 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.318721 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.319147 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.322823 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.328773 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.330702 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.334178 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.335830 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-inventory" (OuterVolumeSpecName: "inventory") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.336241 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.340227 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57f3ea1b-d23e-435c-826f-539c401753be-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "57f3ea1b-d23e-435c-826f-539c401753be" (UID: "57f3ea1b-d23e-435c-826f-539c401753be"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393451 4696 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393495 4696 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393530 4696 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57f3ea1b-d23e-435c-826f-539c401753be-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393545 4696 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393556 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393569 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x446k\" (UniqueName: \"kubernetes.io/projected/57f3ea1b-d23e-435c-826f-539c401753be-kube-api-access-x446k\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393580 4696 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393590 4696 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393627 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393639 4696 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.393651 4696 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/57f3ea1b-d23e-435c-826f-539c401753be-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.597904 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.812858 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"83e9cd26ce0fccbadcc5ae34fe4a8a285026485f53c72a41c064b2bd82e22018"} Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.815112 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" 
event={"ID":"57f3ea1b-d23e-435c-826f-539c401753be","Type":"ContainerDied","Data":"d857ec9980d3cdd528a4865836d0a6627f27cf875f06506c783416e73f0404ae"} Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.815265 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d857ec9980d3cdd528a4865836d0a6627f27cf875f06506c783416e73f0404ae" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.815152 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4dhh" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.942422 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc"] Mar 18 16:19:44 crc kubenswrapper[4696]: E0318 16:19:44.943077 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f3ea1b-d23e-435c-826f-539c401753be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.943096 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f3ea1b-d23e-435c-826f-539c401753be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 16:19:44 crc kubenswrapper[4696]: E0318 16:19:44.943108 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4d5b69-5589-4f08-a153-7e0ae2644a8b" containerName="oc" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.943115 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4d5b69-5589-4f08-a153-7e0ae2644a8b" containerName="oc" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.943333 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4d5b69-5589-4f08-a153-7e0ae2644a8b" containerName="oc" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.943347 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f3ea1b-d23e-435c-826f-539c401753be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 
18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.943965 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.946480 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.952516 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g8zsj" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.952780 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.952908 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.952779 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 18 16:19:44 crc kubenswrapper[4696]: I0318 16:19:44.959421 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc"] Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.013399 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.013468 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.013489 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.013559 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.013600 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.013664 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmmgw\" (UniqueName: 
\"kubernetes.io/projected/9c5cf28b-0e58-48d1-bd91-2a403201c425-kube-api-access-wmmgw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.013690 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.114995 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.115050 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.115071 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: 
\"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.115108 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.115149 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.115206 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmmgw\" (UniqueName: \"kubernetes.io/projected/9c5cf28b-0e58-48d1-bd91-2a403201c425-kube-api-access-wmmgw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.115228 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: 
I0318 16:19:45.121175 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.121186 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.121415 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.123901 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.128188 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.132134 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.132875 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmmgw\" (UniqueName: \"kubernetes.io/projected/9c5cf28b-0e58-48d1-bd91-2a403201c425-kube-api-access-wmmgw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.270040 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.768004 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc"] Mar 18 16:19:45 crc kubenswrapper[4696]: I0318 16:19:45.823183 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" event={"ID":"9c5cf28b-0e58-48d1-bd91-2a403201c425","Type":"ContainerStarted","Data":"617f9e6b207f61f7e1225ed0cd61570812a30c68c6e186b0ecb08fa1aee02191"} Mar 18 16:19:46 crc kubenswrapper[4696]: I0318 16:19:46.831410 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" event={"ID":"9c5cf28b-0e58-48d1-bd91-2a403201c425","Type":"ContainerStarted","Data":"ff6228584ee1b330d4668ab59866450ba97094e4e7e12d4268645bb99e7f853e"} Mar 18 16:19:46 crc kubenswrapper[4696]: I0318 16:19:46.848700 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" podStartSLOduration=2.6806736129999997 podStartE2EDuration="2.848671977s" podCreationTimestamp="2026-03-18 16:19:44 +0000 UTC" firstStartedPulling="2026-03-18 16:19:45.76672821 +0000 UTC m=+2628.772902436" lastFinishedPulling="2026-03-18 16:19:45.934726594 +0000 UTC m=+2628.940900800" observedRunningTime="2026-03-18 16:19:46.845084157 +0000 UTC m=+2629.851258403" watchObservedRunningTime="2026-03-18 16:19:46.848671977 +0000 UTC m=+2629.854846203" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.134239 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564180-h964z"] Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.136396 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-h964z" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.139231 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.139545 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.141351 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.145897 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-h964z"] Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.292964 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l9lk\" (UniqueName: \"kubernetes.io/projected/d33cc811-e3e0-4900-b5fd-579e91e48191-kube-api-access-6l9lk\") pod \"auto-csr-approver-29564180-h964z\" (UID: \"d33cc811-e3e0-4900-b5fd-579e91e48191\") " pod="openshift-infra/auto-csr-approver-29564180-h964z" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.395508 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l9lk\" (UniqueName: \"kubernetes.io/projected/d33cc811-e3e0-4900-b5fd-579e91e48191-kube-api-access-6l9lk\") pod \"auto-csr-approver-29564180-h964z\" (UID: \"d33cc811-e3e0-4900-b5fd-579e91e48191\") " pod="openshift-infra/auto-csr-approver-29564180-h964z" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.418575 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l9lk\" (UniqueName: \"kubernetes.io/projected/d33cc811-e3e0-4900-b5fd-579e91e48191-kube-api-access-6l9lk\") pod \"auto-csr-approver-29564180-h964z\" (UID: \"d33cc811-e3e0-4900-b5fd-579e91e48191\") " 
pod="openshift-infra/auto-csr-approver-29564180-h964z" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.457060 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-h964z" Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.893407 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-h964z"] Mar 18 16:20:00 crc kubenswrapper[4696]: W0318 16:20:00.893937 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd33cc811_e3e0_4900_b5fd_579e91e48191.slice/crio-624b599bee0c084f46496d392fb6b6e87222f2a1a0ea3b795eec97a36585f6b5 WatchSource:0}: Error finding container 624b599bee0c084f46496d392fb6b6e87222f2a1a0ea3b795eec97a36585f6b5: Status 404 returned error can't find the container with id 624b599bee0c084f46496d392fb6b6e87222f2a1a0ea3b795eec97a36585f6b5 Mar 18 16:20:00 crc kubenswrapper[4696]: I0318 16:20:00.951390 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-h964z" event={"ID":"d33cc811-e3e0-4900-b5fd-579e91e48191","Type":"ContainerStarted","Data":"624b599bee0c084f46496d392fb6b6e87222f2a1a0ea3b795eec97a36585f6b5"} Mar 18 16:20:02 crc kubenswrapper[4696]: I0318 16:20:02.966846 4696 generic.go:334] "Generic (PLEG): container finished" podID="d33cc811-e3e0-4900-b5fd-579e91e48191" containerID="c28386e8ee0ab605c159b0bce181c7962f6cfcefc6db939644831dde1693159b" exitCode=0 Mar 18 16:20:02 crc kubenswrapper[4696]: I0318 16:20:02.966936 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-h964z" event={"ID":"d33cc811-e3e0-4900-b5fd-579e91e48191","Type":"ContainerDied","Data":"c28386e8ee0ab605c159b0bce181c7962f6cfcefc6db939644831dde1693159b"} Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.285861 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-27fx2"] Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.288235 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.355326 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-catalog-content\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.355382 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2krfx\" (UniqueName: \"kubernetes.io/projected/58e0351c-3eaa-49ee-8792-1bff64a37d0b-kube-api-access-2krfx\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.355467 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-utilities\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.365590 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27fx2"] Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.456777 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-utilities\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " 
pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.456874 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-catalog-content\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.456925 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2krfx\" (UniqueName: \"kubernetes.io/projected/58e0351c-3eaa-49ee-8792-1bff64a37d0b-kube-api-access-2krfx\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.457228 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-utilities\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.457375 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-catalog-content\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.479828 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2krfx\" (UniqueName: \"kubernetes.io/projected/58e0351c-3eaa-49ee-8792-1bff64a37d0b-kube-api-access-2krfx\") pod \"redhat-marketplace-27fx2\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") " 
pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:03 crc kubenswrapper[4696]: I0318 16:20:03.608690 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.141751 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27fx2"] Mar 18 16:20:04 crc kubenswrapper[4696]: W0318 16:20:04.147301 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58e0351c_3eaa_49ee_8792_1bff64a37d0b.slice/crio-953b5b5f9b68bb776b358e05f0969543a0846895b9d684f86c086b020a8c155a WatchSource:0}: Error finding container 953b5b5f9b68bb776b358e05f0969543a0846895b9d684f86c086b020a8c155a: Status 404 returned error can't find the container with id 953b5b5f9b68bb776b358e05f0969543a0846895b9d684f86c086b020a8c155a Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.842097 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-h964z" Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.892789 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l9lk\" (UniqueName: \"kubernetes.io/projected/d33cc811-e3e0-4900-b5fd-579e91e48191-kube-api-access-6l9lk\") pod \"d33cc811-e3e0-4900-b5fd-579e91e48191\" (UID: \"d33cc811-e3e0-4900-b5fd-579e91e48191\") " Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.904786 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33cc811-e3e0-4900-b5fd-579e91e48191-kube-api-access-6l9lk" (OuterVolumeSpecName: "kube-api-access-6l9lk") pod "d33cc811-e3e0-4900-b5fd-579e91e48191" (UID: "d33cc811-e3e0-4900-b5fd-579e91e48191"). InnerVolumeSpecName "kube-api-access-6l9lk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.988840 4696 generic.go:334] "Generic (PLEG): container finished" podID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerID="12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab" exitCode=0 Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.988917 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27fx2" event={"ID":"58e0351c-3eaa-49ee-8792-1bff64a37d0b","Type":"ContainerDied","Data":"12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab"} Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.988958 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27fx2" event={"ID":"58e0351c-3eaa-49ee-8792-1bff64a37d0b","Type":"ContainerStarted","Data":"953b5b5f9b68bb776b358e05f0969543a0846895b9d684f86c086b020a8c155a"} Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.990797 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-h964z" event={"ID":"d33cc811-e3e0-4900-b5fd-579e91e48191","Type":"ContainerDied","Data":"624b599bee0c084f46496d392fb6b6e87222f2a1a0ea3b795eec97a36585f6b5"} Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.990912 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="624b599bee0c084f46496d392fb6b6e87222f2a1a0ea3b795eec97a36585f6b5" Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.990872 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-h964z" Mar 18 16:20:04 crc kubenswrapper[4696]: I0318 16:20:04.995492 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l9lk\" (UniqueName: \"kubernetes.io/projected/d33cc811-e3e0-4900-b5fd-579e91e48191-kube-api-access-6l9lk\") on node \"crc\" DevicePath \"\"" Mar 18 16:20:05 crc kubenswrapper[4696]: I0318 16:20:05.907980 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-jf5lv"] Mar 18 16:20:05 crc kubenswrapper[4696]: I0318 16:20:05.920943 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-jf5lv"] Mar 18 16:20:06 crc kubenswrapper[4696]: I0318 16:20:06.000686 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27fx2" event={"ID":"58e0351c-3eaa-49ee-8792-1bff64a37d0b","Type":"ContainerStarted","Data":"3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350"} Mar 18 16:20:07 crc kubenswrapper[4696]: I0318 16:20:07.009576 4696 generic.go:334] "Generic (PLEG): container finished" podID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerID="3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350" exitCode=0 Mar 18 16:20:07 crc kubenswrapper[4696]: I0318 16:20:07.009609 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27fx2" event={"ID":"58e0351c-3eaa-49ee-8792-1bff64a37d0b","Type":"ContainerDied","Data":"3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350"} Mar 18 16:20:07 crc kubenswrapper[4696]: I0318 16:20:07.610879 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44086b8-ccf9-4372-8350-f37c793dcc13" path="/var/lib/kubelet/pods/a44086b8-ccf9-4372-8350-f37c793dcc13/volumes" Mar 18 16:20:08 crc kubenswrapper[4696]: I0318 16:20:08.022390 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-27fx2" event={"ID":"58e0351c-3eaa-49ee-8792-1bff64a37d0b","Type":"ContainerStarted","Data":"b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132"} Mar 18 16:20:08 crc kubenswrapper[4696]: I0318 16:20:08.049951 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27fx2" podStartSLOduration=2.517526729 podStartE2EDuration="5.04993187s" podCreationTimestamp="2026-03-18 16:20:03 +0000 UTC" firstStartedPulling="2026-03-18 16:20:04.991505411 +0000 UTC m=+2647.997679627" lastFinishedPulling="2026-03-18 16:20:07.523910562 +0000 UTC m=+2650.530084768" observedRunningTime="2026-03-18 16:20:08.039540549 +0000 UTC m=+2651.045714765" watchObservedRunningTime="2026-03-18 16:20:08.04993187 +0000 UTC m=+2651.056106076" Mar 18 16:20:13 crc kubenswrapper[4696]: I0318 16:20:13.611054 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:13 crc kubenswrapper[4696]: I0318 16:20:13.611395 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:13 crc kubenswrapper[4696]: I0318 16:20:13.653677 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:14 crc kubenswrapper[4696]: I0318 16:20:14.135105 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:14 crc kubenswrapper[4696]: I0318 16:20:14.179953 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27fx2"] Mar 18 16:20:16 crc kubenswrapper[4696]: I0318 16:20:16.093155 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-27fx2" 
podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerName="registry-server" containerID="cri-o://b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132" gracePeriod=2 Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.065684 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.105872 4696 generic.go:334] "Generic (PLEG): container finished" podID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerID="b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132" exitCode=0 Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.105915 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27fx2" event={"ID":"58e0351c-3eaa-49ee-8792-1bff64a37d0b","Type":"ContainerDied","Data":"b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132"} Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.105940 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27fx2" event={"ID":"58e0351c-3eaa-49ee-8792-1bff64a37d0b","Type":"ContainerDied","Data":"953b5b5f9b68bb776b358e05f0969543a0846895b9d684f86c086b020a8c155a"} Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.105956 4696 scope.go:117] "RemoveContainer" containerID="b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132" Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.106077 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27fx2" Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.128475 4696 scope.go:117] "RemoveContainer" containerID="3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350" Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.150858 4696 scope.go:117] "RemoveContainer" containerID="12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab" Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.203723 4696 scope.go:117] "RemoveContainer" containerID="b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132" Mar 18 16:20:17 crc kubenswrapper[4696]: E0318 16:20:17.204154 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132\": container with ID starting with b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132 not found: ID does not exist" containerID="b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132" Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.204181 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132"} err="failed to get container status \"b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132\": rpc error: code = NotFound desc = could not find container \"b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132\": container with ID starting with b50188200d9922ddbbd941f1202de304404c7474591eeec61739f6984e068132 not found: ID does not exist" Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.204199 4696 scope.go:117] "RemoveContainer" containerID="3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350" Mar 18 16:20:17 crc kubenswrapper[4696]: E0318 16:20:17.204442 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350\": container with ID starting with 3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350 not found: ID does not exist" containerID="3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350"
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.204460 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350"} err="failed to get container status \"3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350\": rpc error: code = NotFound desc = could not find container \"3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350\": container with ID starting with 3c6edcadef0208682a472bc450aad09003f472e0549f6e36e51137fd69d68350 not found: ID does not exist"
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.204474 4696 scope.go:117] "RemoveContainer" containerID="12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab"
Mar 18 16:20:17 crc kubenswrapper[4696]: E0318 16:20:17.204693 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab\": container with ID starting with 12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab not found: ID does not exist" containerID="12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab"
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.204714 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab"} err="failed to get container status \"12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab\": rpc error: code = NotFound desc = could not find container \"12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab\": container with ID starting with 12e3130fbf36c7b0958058e0376284c0c0a2cf5b92a81efb067b9b0d4aec55ab not found: ID does not exist"
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.249605 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-utilities\") pod \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") "
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.249848 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-catalog-content\") pod \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") "
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.249975 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2krfx\" (UniqueName: \"kubernetes.io/projected/58e0351c-3eaa-49ee-8792-1bff64a37d0b-kube-api-access-2krfx\") pod \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\" (UID: \"58e0351c-3eaa-49ee-8792-1bff64a37d0b\") "
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.252030 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-utilities" (OuterVolumeSpecName: "utilities") pod "58e0351c-3eaa-49ee-8792-1bff64a37d0b" (UID: "58e0351c-3eaa-49ee-8792-1bff64a37d0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.255838 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e0351c-3eaa-49ee-8792-1bff64a37d0b-kube-api-access-2krfx" (OuterVolumeSpecName: "kube-api-access-2krfx") pod "58e0351c-3eaa-49ee-8792-1bff64a37d0b" (UID: "58e0351c-3eaa-49ee-8792-1bff64a37d0b"). InnerVolumeSpecName "kube-api-access-2krfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.275198 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58e0351c-3eaa-49ee-8792-1bff64a37d0b" (UID: "58e0351c-3eaa-49ee-8792-1bff64a37d0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.352221 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.352253 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2krfx\" (UniqueName: \"kubernetes.io/projected/58e0351c-3eaa-49ee-8792-1bff64a37d0b-kube-api-access-2krfx\") on node \"crc\" DevicePath \"\""
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.352264 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e0351c-3eaa-49ee-8792-1bff64a37d0b-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.445612 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27fx2"]
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.457052 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27fx2"]
Mar 18 16:20:17 crc kubenswrapper[4696]: I0318 16:20:17.607444 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" path="/var/lib/kubelet/pods/58e0351c-3eaa-49ee-8792-1bff64a37d0b/volumes"
Mar 18 16:20:23 crc kubenswrapper[4696]: I0318 16:20:23.386960 4696 scope.go:117] "RemoveContainer" containerID="4d495557875b3aa4b741621151db02aa7f44308c38106614a57a9c60c788c29d"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.686627 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sn9br"]
Mar 18 16:20:41 crc kubenswrapper[4696]: E0318 16:20:41.692093 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerName="registry-server"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.692140 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerName="registry-server"
Mar 18 16:20:41 crc kubenswrapper[4696]: E0318 16:20:41.692164 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33cc811-e3e0-4900-b5fd-579e91e48191" containerName="oc"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.692173 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33cc811-e3e0-4900-b5fd-579e91e48191" containerName="oc"
Mar 18 16:20:41 crc kubenswrapper[4696]: E0318 16:20:41.692182 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerName="extract-utilities"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.692189 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerName="extract-utilities"
Mar 18 16:20:41 crc kubenswrapper[4696]: E0318 16:20:41.692203 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerName="extract-content"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.692210 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerName="extract-content"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.692439 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e0351c-3eaa-49ee-8792-1bff64a37d0b" containerName="registry-server"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.692479 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33cc811-e3e0-4900-b5fd-579e91e48191" containerName="oc"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.693875 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.714485 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn9br"]
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.721708 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-utilities\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.722073 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7tns\" (UniqueName: \"kubernetes.io/projected/318897ee-25bb-4784-ab4a-a09877ea0922-kube-api-access-r7tns\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.722136 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-catalog-content\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.823391 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-utilities\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.823461 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7tns\" (UniqueName: \"kubernetes.io/projected/318897ee-25bb-4784-ab4a-a09877ea0922-kube-api-access-r7tns\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.823499 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-catalog-content\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.824307 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-utilities\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.826025 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-catalog-content\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:41 crc kubenswrapper[4696]: I0318 16:20:41.853873 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7tns\" (UniqueName: \"kubernetes.io/projected/318897ee-25bb-4784-ab4a-a09877ea0922-kube-api-access-r7tns\") pod \"redhat-operators-sn9br\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:42 crc kubenswrapper[4696]: I0318 16:20:42.025108 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:20:42 crc kubenswrapper[4696]: I0318 16:20:42.480637 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn9br"]
Mar 18 16:20:43 crc kubenswrapper[4696]: I0318 16:20:43.331704 4696 generic.go:334] "Generic (PLEG): container finished" podID="318897ee-25bb-4784-ab4a-a09877ea0922" containerID="3f323b8d3626df1b761d989eaf7866aea12dd5bb6e01318c608bacabb71ae046" exitCode=0
Mar 18 16:20:43 crc kubenswrapper[4696]: I0318 16:20:43.331965 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9br" event={"ID":"318897ee-25bb-4784-ab4a-a09877ea0922","Type":"ContainerDied","Data":"3f323b8d3626df1b761d989eaf7866aea12dd5bb6e01318c608bacabb71ae046"}
Mar 18 16:20:43 crc kubenswrapper[4696]: I0318 16:20:43.331990 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9br" event={"ID":"318897ee-25bb-4784-ab4a-a09877ea0922","Type":"ContainerStarted","Data":"79bf6ef232a4afe65a5ffe664907cbd9ab23b7b5af433c7a5b6d535957445cb8"}
Mar 18 16:20:58 crc kubenswrapper[4696]: I0318 16:20:58.468367 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9br" event={"ID":"318897ee-25bb-4784-ab4a-a09877ea0922","Type":"ContainerStarted","Data":"99f81704dad8895dd2ab79226f57b9f1f4ead97f074cccb34122a29a5059406c"}
Mar 18 16:21:01 crc kubenswrapper[4696]: I0318 16:21:01.495932 4696 generic.go:334] "Generic (PLEG): container finished" podID="318897ee-25bb-4784-ab4a-a09877ea0922" containerID="99f81704dad8895dd2ab79226f57b9f1f4ead97f074cccb34122a29a5059406c" exitCode=0
Mar 18 16:21:01 crc kubenswrapper[4696]: I0318 16:21:01.496458 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9br" event={"ID":"318897ee-25bb-4784-ab4a-a09877ea0922","Type":"ContainerDied","Data":"99f81704dad8895dd2ab79226f57b9f1f4ead97f074cccb34122a29a5059406c"}
Mar 18 16:21:02 crc kubenswrapper[4696]: I0318 16:21:02.510736 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9br" event={"ID":"318897ee-25bb-4784-ab4a-a09877ea0922","Type":"ContainerStarted","Data":"c0620558b993c6d7c4bfecdd4c009f34fcf85f5c8eacf94ac2dda90eb884f6d3"}
Mar 18 16:21:12 crc kubenswrapper[4696]: I0318 16:21:12.025909 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:21:12 crc kubenswrapper[4696]: I0318 16:21:12.026466 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:21:13 crc kubenswrapper[4696]: I0318 16:21:13.072192 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sn9br" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="registry-server" probeResult="failure" output=<
Mar 18 16:21:13 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s
Mar 18 16:21:13 crc kubenswrapper[4696]: >
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.074433 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.126098 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sn9br" podStartSLOduration=22.531110267 podStartE2EDuration="41.1260697s" podCreationTimestamp="2026-03-18 16:20:41 +0000 UTC" firstStartedPulling="2026-03-18 16:20:43.33360339 +0000 UTC m=+2686.339777596" lastFinishedPulling="2026-03-18 16:21:01.928562823 +0000 UTC m=+2704.934737029" observedRunningTime="2026-03-18 16:21:02.53529061 +0000 UTC m=+2705.541464826" watchObservedRunningTime="2026-03-18 16:21:22.1260697 +0000 UTC m=+2725.132243906"
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.135390 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sn9br"
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.199866 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn9br"]
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.389360 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gskmp"]
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.389607 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gskmp" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="registry-server" containerID="cri-o://945fb57c284158d29a55c61cedee33484bfb1a70ba6d2799581a68133b6ee79e" gracePeriod=2
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.717110 4696 generic.go:334] "Generic (PLEG): container finished" podID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerID="945fb57c284158d29a55c61cedee33484bfb1a70ba6d2799581a68133b6ee79e" exitCode=0
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.717192 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gskmp" event={"ID":"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d","Type":"ContainerDied","Data":"945fb57c284158d29a55c61cedee33484bfb1a70ba6d2799581a68133b6ee79e"}
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.826759 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gskmp"
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.946403 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-catalog-content\") pod \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") "
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.946538 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqvq\" (UniqueName: \"kubernetes.io/projected/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-kube-api-access-5wqvq\") pod \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") "
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.946668 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-utilities\") pod \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\" (UID: \"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d\") "
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.947326 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-utilities" (OuterVolumeSpecName: "utilities") pod "ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" (UID: "ee9f444d-f1ec-493e-bd45-8ac0c395dd9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:21:22 crc kubenswrapper[4696]: I0318 16:21:22.952187 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-kube-api-access-5wqvq" (OuterVolumeSpecName: "kube-api-access-5wqvq") pod "ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" (UID: "ee9f444d-f1ec-493e-bd45-8ac0c395dd9d"). InnerVolumeSpecName "kube-api-access-5wqvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.049026 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqvq\" (UniqueName: \"kubernetes.io/projected/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-kube-api-access-5wqvq\") on node \"crc\" DevicePath \"\""
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.049073 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.066621 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" (UID: "ee9f444d-f1ec-493e-bd45-8ac0c395dd9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.150535 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.482589 4696 scope.go:117] "RemoveContainer" containerID="2b4b4cc07141f78984957efc64a65681497ec3b1bf6bd74b1c2b70579efd3cc6"
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.518490 4696 scope.go:117] "RemoveContainer" containerID="945fb57c284158d29a55c61cedee33484bfb1a70ba6d2799581a68133b6ee79e"
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.563472 4696 scope.go:117] "RemoveContainer" containerID="8a0becac611e69353d3ab091a574b3da9c207912c4be9e0031e7af71920f2efd"
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.724087 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gskmp"
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.724111 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gskmp" event={"ID":"ee9f444d-f1ec-493e-bd45-8ac0c395dd9d","Type":"ContainerDied","Data":"29b2810a9988ef00e3b6968f8d183de69fa0759e23b8af3fa660c817b9f1ebc4"}
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.766877 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gskmp"]
Mar 18 16:21:23 crc kubenswrapper[4696]: I0318 16:21:23.775542 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gskmp"]
Mar 18 16:21:25 crc kubenswrapper[4696]: I0318 16:21:25.608537 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" path="/var/lib/kubelet/pods/ee9f444d-f1ec-493e-bd45-8ac0c395dd9d/volumes"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.150205 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564182-xzs5r"]
Mar 18 16:22:00 crc kubenswrapper[4696]: E0318 16:22:00.151062 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="extract-utilities"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.151075 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="extract-utilities"
Mar 18 16:22:00 crc kubenswrapper[4696]: E0318 16:22:00.151087 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="registry-server"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.151093 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="registry-server"
Mar 18 16:22:00 crc kubenswrapper[4696]: E0318 16:22:00.151117 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="extract-content"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.151123 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="extract-content"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.151332 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9f444d-f1ec-493e-bd45-8ac0c395dd9d" containerName="registry-server"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.151954 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-xzs5r"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.154330 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.154757 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.155199 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.167836 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-xzs5r"]
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.288794 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gsjn\" (UniqueName: \"kubernetes.io/projected/2b908965-da67-48dc-91df-0a792796b6c8-kube-api-access-4gsjn\") pod \"auto-csr-approver-29564182-xzs5r\" (UID: \"2b908965-da67-48dc-91df-0a792796b6c8\") " pod="openshift-infra/auto-csr-approver-29564182-xzs5r"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.390915 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gsjn\" (UniqueName: \"kubernetes.io/projected/2b908965-da67-48dc-91df-0a792796b6c8-kube-api-access-4gsjn\") pod \"auto-csr-approver-29564182-xzs5r\" (UID: \"2b908965-da67-48dc-91df-0a792796b6c8\") " pod="openshift-infra/auto-csr-approver-29564182-xzs5r"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.412748 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gsjn\" (UniqueName: \"kubernetes.io/projected/2b908965-da67-48dc-91df-0a792796b6c8-kube-api-access-4gsjn\") pod \"auto-csr-approver-29564182-xzs5r\" (UID: \"2b908965-da67-48dc-91df-0a792796b6c8\") " pod="openshift-infra/auto-csr-approver-29564182-xzs5r"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.474690 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-xzs5r"
Mar 18 16:22:00 crc kubenswrapper[4696]: I0318 16:22:00.905615 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-xzs5r"]
Mar 18 16:22:01 crc kubenswrapper[4696]: I0318 16:22:01.018251 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-xzs5r" event={"ID":"2b908965-da67-48dc-91df-0a792796b6c8","Type":"ContainerStarted","Data":"8a896c73cd8022084be445097add4938135fb55ae53b0eda815690a4655d1ace"}
Mar 18 16:22:03 crc kubenswrapper[4696]: I0318 16:22:03.035973 4696 generic.go:334] "Generic (PLEG): container finished" podID="9c5cf28b-0e58-48d1-bd91-2a403201c425" containerID="ff6228584ee1b330d4668ab59866450ba97094e4e7e12d4268645bb99e7f853e" exitCode=0
Mar 18 16:22:03 crc kubenswrapper[4696]: I0318 16:22:03.036044 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" event={"ID":"9c5cf28b-0e58-48d1-bd91-2a403201c425","Type":"ContainerDied","Data":"ff6228584ee1b330d4668ab59866450ba97094e4e7e12d4268645bb99e7f853e"}
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.047667 4696 generic.go:334] "Generic (PLEG): container finished" podID="2b908965-da67-48dc-91df-0a792796b6c8" containerID="48423fda3797c51d170bc3ac024a2a19407874b560c50505b5862c898048d226" exitCode=0
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.047779 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-xzs5r" event={"ID":"2b908965-da67-48dc-91df-0a792796b6c8","Type":"ContainerDied","Data":"48423fda3797c51d170bc3ac024a2a19407874b560c50505b5862c898048d226"}
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.508670 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc"
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.579585 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-0\") pod \"9c5cf28b-0e58-48d1-bd91-2a403201c425\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") "
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.579698 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-1\") pod \"9c5cf28b-0e58-48d1-bd91-2a403201c425\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") "
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.579732 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-2\") pod \"9c5cf28b-0e58-48d1-bd91-2a403201c425\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") "
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.579818 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-inventory\") pod \"9c5cf28b-0e58-48d1-bd91-2a403201c425\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") "
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.579860 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-telemetry-combined-ca-bundle\") pod \"9c5cf28b-0e58-48d1-bd91-2a403201c425\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") "
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.579887 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmmgw\" (UniqueName: \"kubernetes.io/projected/9c5cf28b-0e58-48d1-bd91-2a403201c425-kube-api-access-wmmgw\") pod \"9c5cf28b-0e58-48d1-bd91-2a403201c425\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") "
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.579913 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ssh-key-openstack-edpm-ipam\") pod \"9c5cf28b-0e58-48d1-bd91-2a403201c425\" (UID: \"9c5cf28b-0e58-48d1-bd91-2a403201c425\") "
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.585040 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9c5cf28b-0e58-48d1-bd91-2a403201c425" (UID: "9c5cf28b-0e58-48d1-bd91-2a403201c425"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.585720 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5cf28b-0e58-48d1-bd91-2a403201c425-kube-api-access-wmmgw" (OuterVolumeSpecName: "kube-api-access-wmmgw") pod "9c5cf28b-0e58-48d1-bd91-2a403201c425" (UID: "9c5cf28b-0e58-48d1-bd91-2a403201c425"). InnerVolumeSpecName "kube-api-access-wmmgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.607175 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-inventory" (OuterVolumeSpecName: "inventory") pod "9c5cf28b-0e58-48d1-bd91-2a403201c425" (UID: "9c5cf28b-0e58-48d1-bd91-2a403201c425"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.608280 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9c5cf28b-0e58-48d1-bd91-2a403201c425" (UID: "9c5cf28b-0e58-48d1-bd91-2a403201c425"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.608898 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "9c5cf28b-0e58-48d1-bd91-2a403201c425" (UID: "9c5cf28b-0e58-48d1-bd91-2a403201c425"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.609496 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "9c5cf28b-0e58-48d1-bd91-2a403201c425" (UID: "9c5cf28b-0e58-48d1-bd91-2a403201c425"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.609961 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "9c5cf28b-0e58-48d1-bd91-2a403201c425" (UID: "9c5cf28b-0e58-48d1-bd91-2a403201c425"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.683177 4696 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.683250 4696 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.683271 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.683295 4696 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.683320 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmmgw\" (UniqueName: \"kubernetes.io/projected/9c5cf28b-0e58-48d1-bd91-2a403201c425-kube-api-access-wmmgw\") on node \"crc\" DevicePath \"\""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.683352 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 16:22:04 crc kubenswrapper[4696]: I0318 16:22:04.683388 4696 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c5cf28b-0e58-48d1-bd91-2a403201c425-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Mar 18 16:22:05 crc kubenswrapper[4696]: I0318 16:22:05.062888 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc" event={"ID":"9c5cf28b-0e58-48d1-bd91-2a403201c425","Type":"ContainerDied","Data":"617f9e6b207f61f7e1225ed0cd61570812a30c68c6e186b0ecb08fa1aee02191"}
Mar 18 16:22:05 crc kubenswrapper[4696]: I0318 16:22:05.062933 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc"
Mar 18 16:22:05 crc kubenswrapper[4696]: I0318 16:22:05.062957 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="617f9e6b207f61f7e1225ed0cd61570812a30c68c6e186b0ecb08fa1aee02191"
Mar 18 16:22:05 crc kubenswrapper[4696]: I0318 16:22:05.375817 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-xzs5r"
Mar 18 16:22:05 crc kubenswrapper[4696]: I0318 16:22:05.395732 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gsjn\" (UniqueName: \"kubernetes.io/projected/2b908965-da67-48dc-91df-0a792796b6c8-kube-api-access-4gsjn\") pod \"2b908965-da67-48dc-91df-0a792796b6c8\" (UID: \"2b908965-da67-48dc-91df-0a792796b6c8\") "
Mar 18 16:22:05 crc kubenswrapper[4696]: I0318 16:22:05.400543 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b908965-da67-48dc-91df-0a792796b6c8-kube-api-access-4gsjn" (OuterVolumeSpecName: "kube-api-access-4gsjn") pod "2b908965-da67-48dc-91df-0a792796b6c8" (UID: "2b908965-da67-48dc-91df-0a792796b6c8"). InnerVolumeSpecName "kube-api-access-4gsjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:22:05 crc kubenswrapper[4696]: I0318 16:22:05.498268 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gsjn\" (UniqueName: \"kubernetes.io/projected/2b908965-da67-48dc-91df-0a792796b6c8-kube-api-access-4gsjn\") on node \"crc\" DevicePath \"\""
Mar 18 16:22:06 crc kubenswrapper[4696]: I0318 16:22:06.073823 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-xzs5r" event={"ID":"2b908965-da67-48dc-91df-0a792796b6c8","Type":"ContainerDied","Data":"8a896c73cd8022084be445097add4938135fb55ae53b0eda815690a4655d1ace"}
Mar 18 16:22:06 crc kubenswrapper[4696]: I0318 16:22:06.073870 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a896c73cd8022084be445097add4938135fb55ae53b0eda815690a4655d1ace"
Mar 18 16:22:06 crc kubenswrapper[4696]: I0318 16:22:06.073908 4696 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-xzs5r" Mar 18 16:22:06 crc kubenswrapper[4696]: I0318 16:22:06.441785 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-mrsqq"] Mar 18 16:22:06 crc kubenswrapper[4696]: I0318 16:22:06.449479 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-mrsqq"] Mar 18 16:22:07 crc kubenswrapper[4696]: I0318 16:22:07.609074 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ff9679-d93d-4d70-b458-12b21f8097d4" path="/var/lib/kubelet/pods/10ff9679-d93d-4d70-b458-12b21f8097d4/volumes" Mar 18 16:22:12 crc kubenswrapper[4696]: I0318 16:22:12.184923 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:22:12 crc kubenswrapper[4696]: I0318 16:22:12.185588 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:22:23 crc kubenswrapper[4696]: I0318 16:22:23.654401 4696 scope.go:117] "RemoveContainer" containerID="f0a5725eeec6b69afec35477fda15f34b84ec462ed65695886a51560cf5251ca" Mar 18 16:22:42 crc kubenswrapper[4696]: I0318 16:22:42.184490 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:22:42 crc kubenswrapper[4696]: 
I0318 16:22:42.185234 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.031128 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 16:23:06 crc kubenswrapper[4696]: E0318 16:23:06.032138 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b908965-da67-48dc-91df-0a792796b6c8" containerName="oc" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.032158 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b908965-da67-48dc-91df-0a792796b6c8" containerName="oc" Mar 18 16:23:06 crc kubenswrapper[4696]: E0318 16:23:06.032183 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5cf28b-0e58-48d1-bd91-2a403201c425" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.032193 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5cf28b-0e58-48d1-bd91-2a403201c425" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.032443 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5cf28b-0e58-48d1-bd91-2a403201c425" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.032466 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b908965-da67-48dc-91df-0a792796b6c8" containerName="oc" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.033122 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.038871 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.038902 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mh9cr" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.039090 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.039756 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.043847 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178013 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-config-data\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178088 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178175 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178249 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178278 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178326 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178351 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178398 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.178424 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4jwc\" (UniqueName: \"kubernetes.io/projected/0b6d2f26-746f-404e-817e-ca3b65cc9511-kube-api-access-h4jwc\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.279879 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.279933 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4jwc\" (UniqueName: \"kubernetes.io/projected/0b6d2f26-746f-404e-817e-ca3b65cc9511-kube-api-access-h4jwc\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.279970 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-config-data\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.280008 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config-secret\") pod 
\"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.280042 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.280098 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.280139 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.280745 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.280915 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: 
\"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.280942 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.281178 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.281185 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.281227 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.281304 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-config-data\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: 
I0318 16:23:06.285561 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.286363 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.292481 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.299435 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4jwc\" (UniqueName: \"kubernetes.io/projected/0b6d2f26-746f-404e-817e-ca3b65cc9511-kube-api-access-h4jwc\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.317868 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.358445 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.798464 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 16:23:06 crc kubenswrapper[4696]: I0318 16:23:06.812006 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.409148 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqh5w"] Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.411062 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.429381 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqh5w"] Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.603026 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbd28\" (UniqueName: \"kubernetes.io/projected/e922b989-d6b8-4b09-80ef-a942753a005a-kube-api-access-dbd28\") pod \"community-operators-vqh5w\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.603755 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-catalog-content\") pod \"community-operators-vqh5w\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.603956 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-utilities\") pod \"community-operators-vqh5w\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.658442 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0b6d2f26-746f-404e-817e-ca3b65cc9511","Type":"ContainerStarted","Data":"f6913b29e714cfe5157a85027f2b20af34cd120321718f63a3b3e3d303995ddd"} Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.706304 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-catalog-content\") pod \"community-operators-vqh5w\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.706445 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-utilities\") pod \"community-operators-vqh5w\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.706572 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbd28\" (UniqueName: \"kubernetes.io/projected/e922b989-d6b8-4b09-80ef-a942753a005a-kube-api-access-dbd28\") pod \"community-operators-vqh5w\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.707063 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-catalog-content\") pod \"community-operators-vqh5w\" 
(UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.707306 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-utilities\") pod \"community-operators-vqh5w\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:07 crc kubenswrapper[4696]: I0318 16:23:07.744438 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbd28\" (UniqueName: \"kubernetes.io/projected/e922b989-d6b8-4b09-80ef-a942753a005a-kube-api-access-dbd28\") pod \"community-operators-vqh5w\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:08 crc kubenswrapper[4696]: I0318 16:23:08.041831 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:08 crc kubenswrapper[4696]: W0318 16:23:08.517283 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode922b989_d6b8_4b09_80ef_a942753a005a.slice/crio-c8e8f92cca254710d27327d0ca5f44efd3f8831fea1cce3fa45acfb9a5b26c73 WatchSource:0}: Error finding container c8e8f92cca254710d27327d0ca5f44efd3f8831fea1cce3fa45acfb9a5b26c73: Status 404 returned error can't find the container with id c8e8f92cca254710d27327d0ca5f44efd3f8831fea1cce3fa45acfb9a5b26c73 Mar 18 16:23:08 crc kubenswrapper[4696]: I0318 16:23:08.530036 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqh5w"] Mar 18 16:23:08 crc kubenswrapper[4696]: I0318 16:23:08.669217 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqh5w" 
event={"ID":"e922b989-d6b8-4b09-80ef-a942753a005a","Type":"ContainerStarted","Data":"c8e8f92cca254710d27327d0ca5f44efd3f8831fea1cce3fa45acfb9a5b26c73"} Mar 18 16:23:09 crc kubenswrapper[4696]: I0318 16:23:09.682088 4696 generic.go:334] "Generic (PLEG): container finished" podID="e922b989-d6b8-4b09-80ef-a942753a005a" containerID="89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161" exitCode=0 Mar 18 16:23:09 crc kubenswrapper[4696]: I0318 16:23:09.682143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqh5w" event={"ID":"e922b989-d6b8-4b09-80ef-a942753a005a","Type":"ContainerDied","Data":"89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161"} Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.616239 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ntgbh"] Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.618341 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.627989 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntgbh"] Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.783267 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-catalog-content\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.783457 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74mjf\" (UniqueName: \"kubernetes.io/projected/e589f63f-6e44-4efd-a82e-9fc7ccaba832-kube-api-access-74mjf\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.783498 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-utilities\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.885856 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-catalog-content\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.885977 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-74mjf\" (UniqueName: \"kubernetes.io/projected/e589f63f-6e44-4efd-a82e-9fc7ccaba832-kube-api-access-74mjf\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.886022 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-utilities\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.886504 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-utilities\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.886793 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-catalog-content\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.911558 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74mjf\" (UniqueName: \"kubernetes.io/projected/e589f63f-6e44-4efd-a82e-9fc7ccaba832-kube-api-access-74mjf\") pod \"certified-operators-ntgbh\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") " pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:11 crc kubenswrapper[4696]: I0318 16:23:11.943978 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:12 crc kubenswrapper[4696]: I0318 16:23:12.184970 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:23:12 crc kubenswrapper[4696]: I0318 16:23:12.185032 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:23:12 crc kubenswrapper[4696]: I0318 16:23:12.185083 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 16:23:12 crc kubenswrapper[4696]: I0318 16:23:12.185860 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83e9cd26ce0fccbadcc5ae34fe4a8a285026485f53c72a41c064b2bd82e22018"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:23:12 crc kubenswrapper[4696]: I0318 16:23:12.185922 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://83e9cd26ce0fccbadcc5ae34fe4a8a285026485f53c72a41c064b2bd82e22018" gracePeriod=600 Mar 18 16:23:13 crc kubenswrapper[4696]: I0318 16:23:13.766788 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="83e9cd26ce0fccbadcc5ae34fe4a8a285026485f53c72a41c064b2bd82e22018" exitCode=0 Mar 18 16:23:13 crc kubenswrapper[4696]: I0318 16:23:13.766983 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"83e9cd26ce0fccbadcc5ae34fe4a8a285026485f53c72a41c064b2bd82e22018"} Mar 18 16:23:13 crc kubenswrapper[4696]: I0318 16:23:13.767389 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"} Mar 18 16:23:13 crc kubenswrapper[4696]: I0318 16:23:13.768594 4696 scope.go:117] "RemoveContainer" containerID="9480c03becbc058e08453b1256fd00b3d54837d46bf009254e58bb77002bc48f" Mar 18 16:23:13 crc kubenswrapper[4696]: I0318 16:23:13.957528 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntgbh"] Mar 18 16:23:14 crc kubenswrapper[4696]: I0318 16:23:14.786904 4696 generic.go:334] "Generic (PLEG): container finished" podID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerID="4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9" exitCode=0 Mar 18 16:23:14 crc kubenswrapper[4696]: I0318 16:23:14.787385 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntgbh" event={"ID":"e589f63f-6e44-4efd-a82e-9fc7ccaba832","Type":"ContainerDied","Data":"4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9"} Mar 18 16:23:14 crc kubenswrapper[4696]: I0318 16:23:14.787541 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntgbh" 
event={"ID":"e589f63f-6e44-4efd-a82e-9fc7ccaba832","Type":"ContainerStarted","Data":"fcb67deaa0944b85449b3976cf921a5ebf6b7bcd58bc528ac1ceaad358e416a6"} Mar 18 16:23:15 crc kubenswrapper[4696]: I0318 16:23:15.800047 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqh5w" event={"ID":"e922b989-d6b8-4b09-80ef-a942753a005a","Type":"ContainerStarted","Data":"857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0"} Mar 18 16:23:16 crc kubenswrapper[4696]: I0318 16:23:16.809836 4696 generic.go:334] "Generic (PLEG): container finished" podID="e922b989-d6b8-4b09-80ef-a942753a005a" containerID="857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0" exitCode=0 Mar 18 16:23:16 crc kubenswrapper[4696]: I0318 16:23:16.810046 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqh5w" event={"ID":"e922b989-d6b8-4b09-80ef-a942753a005a","Type":"ContainerDied","Data":"857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0"} Mar 18 16:23:18 crc kubenswrapper[4696]: I0318 16:23:18.831325 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqh5w" event={"ID":"e922b989-d6b8-4b09-80ef-a942753a005a","Type":"ContainerStarted","Data":"c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492"} Mar 18 16:23:18 crc kubenswrapper[4696]: I0318 16:23:18.836642 4696 generic.go:334] "Generic (PLEG): container finished" podID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerID="d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f" exitCode=0 Mar 18 16:23:18 crc kubenswrapper[4696]: I0318 16:23:18.836687 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntgbh" event={"ID":"e589f63f-6e44-4efd-a82e-9fc7ccaba832","Type":"ContainerDied","Data":"d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f"} Mar 18 16:23:18 crc kubenswrapper[4696]: 
I0318 16:23:18.853424 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqh5w" podStartSLOduration=7.609318715 podStartE2EDuration="11.85339931s" podCreationTimestamp="2026-03-18 16:23:07 +0000 UTC" firstStartedPulling="2026-03-18 16:23:13.422581876 +0000 UTC m=+2836.428756072" lastFinishedPulling="2026-03-18 16:23:17.666662461 +0000 UTC m=+2840.672836667" observedRunningTime="2026-03-18 16:23:18.848227759 +0000 UTC m=+2841.854401975" watchObservedRunningTime="2026-03-18 16:23:18.85339931 +0000 UTC m=+2841.859573546" Mar 18 16:23:28 crc kubenswrapper[4696]: I0318 16:23:28.042875 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:28 crc kubenswrapper[4696]: I0318 16:23:28.043457 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:29 crc kubenswrapper[4696]: I0318 16:23:29.090618 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vqh5w" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="registry-server" probeResult="failure" output=< Mar 18 16:23:29 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:23:29 crc kubenswrapper[4696]: > Mar 18 16:23:37 crc kubenswrapper[4696]: E0318 16:23:37.991758 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 18 16:23:37 crc kubenswrapper[4696]: E0318 16:23:37.992422 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4jwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(0b6d2f26-746f-404e-817e-ca3b65cc9511): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 16:23:37 crc kubenswrapper[4696]: E0318 16:23:37.993962 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="0b6d2f26-746f-404e-817e-ca3b65cc9511" Mar 18 16:23:38 crc kubenswrapper[4696]: E0318 16:23:38.029819 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="0b6d2f26-746f-404e-817e-ca3b65cc9511" Mar 18 16:23:39 crc 
kubenswrapper[4696]: I0318 16:23:39.036883 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntgbh" event={"ID":"e589f63f-6e44-4efd-a82e-9fc7ccaba832","Type":"ContainerStarted","Data":"e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5"} Mar 18 16:23:39 crc kubenswrapper[4696]: I0318 16:23:39.062112 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ntgbh" podStartSLOduration=4.923403927 podStartE2EDuration="28.062093097s" podCreationTimestamp="2026-03-18 16:23:11 +0000 UTC" firstStartedPulling="2026-03-18 16:23:14.796225566 +0000 UTC m=+2837.802399772" lastFinishedPulling="2026-03-18 16:23:37.934914746 +0000 UTC m=+2860.941088942" observedRunningTime="2026-03-18 16:23:39.058485226 +0000 UTC m=+2862.064659442" watchObservedRunningTime="2026-03-18 16:23:39.062093097 +0000 UTC m=+2862.068267303" Mar 18 16:23:39 crc kubenswrapper[4696]: I0318 16:23:39.096865 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vqh5w" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="registry-server" probeResult="failure" output=< Mar 18 16:23:39 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:23:39 crc kubenswrapper[4696]: > Mar 18 16:23:41 crc kubenswrapper[4696]: I0318 16:23:41.944496 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:41 crc kubenswrapper[4696]: I0318 16:23:41.944932 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:23:43 crc kubenswrapper[4696]: I0318 16:23:43.001654 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ntgbh" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="registry-server" 
probeResult="failure" output=< Mar 18 16:23:43 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:23:43 crc kubenswrapper[4696]: > Mar 18 16:23:48 crc kubenswrapper[4696]: I0318 16:23:48.087100 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:48 crc kubenswrapper[4696]: I0318 16:23:48.133010 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:48 crc kubenswrapper[4696]: I0318 16:23:48.322965 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqh5w"] Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.122464 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqh5w" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="registry-server" containerID="cri-o://c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492" gracePeriod=2 Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.592707 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.674434 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbd28\" (UniqueName: \"kubernetes.io/projected/e922b989-d6b8-4b09-80ef-a942753a005a-kube-api-access-dbd28\") pod \"e922b989-d6b8-4b09-80ef-a942753a005a\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.674480 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-utilities\") pod \"e922b989-d6b8-4b09-80ef-a942753a005a\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.674513 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-catalog-content\") pod \"e922b989-d6b8-4b09-80ef-a942753a005a\" (UID: \"e922b989-d6b8-4b09-80ef-a942753a005a\") " Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.682493 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-utilities" (OuterVolumeSpecName: "utilities") pod "e922b989-d6b8-4b09-80ef-a942753a005a" (UID: "e922b989-d6b8-4b09-80ef-a942753a005a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.693256 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e922b989-d6b8-4b09-80ef-a942753a005a-kube-api-access-dbd28" (OuterVolumeSpecName: "kube-api-access-dbd28") pod "e922b989-d6b8-4b09-80ef-a942753a005a" (UID: "e922b989-d6b8-4b09-80ef-a942753a005a"). InnerVolumeSpecName "kube-api-access-dbd28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.731341 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e922b989-d6b8-4b09-80ef-a942753a005a" (UID: "e922b989-d6b8-4b09-80ef-a942753a005a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.777160 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.777236 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbd28\" (UniqueName: \"kubernetes.io/projected/e922b989-d6b8-4b09-80ef-a942753a005a-kube-api-access-dbd28\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4696]: I0318 16:23:49.777261 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e922b989-d6b8-4b09-80ef-a942753a005a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.133230 4696 generic.go:334] "Generic (PLEG): container finished" podID="e922b989-d6b8-4b09-80ef-a942753a005a" containerID="c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492" exitCode=0 Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.133274 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqh5w" event={"ID":"e922b989-d6b8-4b09-80ef-a942753a005a","Type":"ContainerDied","Data":"c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492"} Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.133302 4696 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-vqh5w" event={"ID":"e922b989-d6b8-4b09-80ef-a942753a005a","Type":"ContainerDied","Data":"c8e8f92cca254710d27327d0ca5f44efd3f8831fea1cce3fa45acfb9a5b26c73"} Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.133323 4696 scope.go:117] "RemoveContainer" containerID="c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.133359 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqh5w" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.165556 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqh5w"] Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.167324 4696 scope.go:117] "RemoveContainer" containerID="857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.174448 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqh5w"] Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.191539 4696 scope.go:117] "RemoveContainer" containerID="89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.228662 4696 scope.go:117] "RemoveContainer" containerID="c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492" Mar 18 16:23:50 crc kubenswrapper[4696]: E0318 16:23:50.229124 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492\": container with ID starting with c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492 not found: ID does not exist" containerID="c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 
16:23:50.229150 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492"} err="failed to get container status \"c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492\": rpc error: code = NotFound desc = could not find container \"c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492\": container with ID starting with c5c7d4595e682508b8f4f799723849ffbe123f0f2dddb8d12bb2b48842b36492 not found: ID does not exist" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.229169 4696 scope.go:117] "RemoveContainer" containerID="857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0" Mar 18 16:23:50 crc kubenswrapper[4696]: E0318 16:23:50.229556 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0\": container with ID starting with 857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0 not found: ID does not exist" containerID="857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.229589 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0"} err="failed to get container status \"857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0\": rpc error: code = NotFound desc = could not find container \"857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0\": container with ID starting with 857f715e61395984d9b48eda0c1735b5849704ed0a6913b1a2be33e87e52a9e0 not found: ID does not exist" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.229606 4696 scope.go:117] "RemoveContainer" containerID="89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161" Mar 18 16:23:50 crc 
kubenswrapper[4696]: E0318 16:23:50.229823 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161\": container with ID starting with 89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161 not found: ID does not exist" containerID="89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161" Mar 18 16:23:50 crc kubenswrapper[4696]: I0318 16:23:50.229840 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161"} err="failed to get container status \"89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161\": rpc error: code = NotFound desc = could not find container \"89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161\": container with ID starting with 89f1baadffb29287d0837027ceab37b9a7cd2117250e89d0cd1b969c584e4161 not found: ID does not exist" Mar 18 16:23:51 crc kubenswrapper[4696]: I0318 16:23:51.608970 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" path="/var/lib/kubelet/pods/e922b989-d6b8-4b09-80ef-a942753a005a/volumes" Mar 18 16:23:52 crc kubenswrapper[4696]: I0318 16:23:52.993575 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ntgbh" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="registry-server" probeResult="failure" output=< Mar 18 16:23:52 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:23:52 crc kubenswrapper[4696]: > Mar 18 16:23:55 crc kubenswrapper[4696]: I0318 16:23:55.186423 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"0b6d2f26-746f-404e-817e-ca3b65cc9511","Type":"ContainerStarted","Data":"902bc91e69e9e251b788b2b65ef20041bdb77846792913500c92c9f1a127bac6"} Mar 18 16:23:55 crc kubenswrapper[4696]: I0318 16:23:55.211095 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.99352421 podStartE2EDuration="51.211073487s" podCreationTimestamp="2026-03-18 16:23:04 +0000 UTC" firstStartedPulling="2026-03-18 16:23:06.811768295 +0000 UTC m=+2829.817942501" lastFinishedPulling="2026-03-18 16:23:53.029317552 +0000 UTC m=+2876.035491778" observedRunningTime="2026-03-18 16:23:55.207765444 +0000 UTC m=+2878.213939640" watchObservedRunningTime="2026-03-18 16:23:55.211073487 +0000 UTC m=+2878.217247693" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.141448 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564184-nwvrp"] Mar 18 16:24:00 crc kubenswrapper[4696]: E0318 16:24:00.142498 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="extract-utilities" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.142604 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="extract-utilities" Mar 18 16:24:00 crc kubenswrapper[4696]: E0318 16:24:00.142651 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="registry-server" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.142660 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="registry-server" Mar 18 16:24:00 crc kubenswrapper[4696]: E0318 16:24:00.142685 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="extract-content" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 
16:24:00.142695 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="extract-content" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.142942 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e922b989-d6b8-4b09-80ef-a942753a005a" containerName="registry-server" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.143666 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-nwvrp" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.146086 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.146274 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.148077 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.154287 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-nwvrp"] Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.312559 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8hwz\" (UniqueName: \"kubernetes.io/projected/a4ba1a70-fee4-454d-9862-2f7a41fb5a5b-kube-api-access-z8hwz\") pod \"auto-csr-approver-29564184-nwvrp\" (UID: \"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b\") " pod="openshift-infra/auto-csr-approver-29564184-nwvrp" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.414430 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8hwz\" (UniqueName: \"kubernetes.io/projected/a4ba1a70-fee4-454d-9862-2f7a41fb5a5b-kube-api-access-z8hwz\") pod \"auto-csr-approver-29564184-nwvrp\" (UID: 
\"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b\") " pod="openshift-infra/auto-csr-approver-29564184-nwvrp" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.432755 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8hwz\" (UniqueName: \"kubernetes.io/projected/a4ba1a70-fee4-454d-9862-2f7a41fb5a5b-kube-api-access-z8hwz\") pod \"auto-csr-approver-29564184-nwvrp\" (UID: \"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b\") " pod="openshift-infra/auto-csr-approver-29564184-nwvrp" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.463925 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-nwvrp" Mar 18 16:24:00 crc kubenswrapper[4696]: I0318 16:24:00.903399 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-nwvrp"] Mar 18 16:24:01 crc kubenswrapper[4696]: I0318 16:24:01.238678 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-nwvrp" event={"ID":"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b","Type":"ContainerStarted","Data":"7fe9d79dbc50416a88c1a2f5c42f9ce8b4dc1a3e5497a57f5027af8db7c62d4d"} Mar 18 16:24:01 crc kubenswrapper[4696]: I0318 16:24:01.990384 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:24:02 crc kubenswrapper[4696]: I0318 16:24:02.041849 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ntgbh" Mar 18 16:24:02 crc kubenswrapper[4696]: I0318 16:24:02.227121 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntgbh"] Mar 18 16:24:02 crc kubenswrapper[4696]: I0318 16:24:02.249637 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-nwvrp" 
event={"ID":"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b","Type":"ContainerStarted","Data":"1dae12c06edce4d6cf4dba9ca25045a3acb8f0ca1e2511435d11a44bb750d8e6"} Mar 18 16:24:02 crc kubenswrapper[4696]: I0318 16:24:02.272060 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564184-nwvrp" podStartSLOduration=1.270267775 podStartE2EDuration="2.272039031s" podCreationTimestamp="2026-03-18 16:24:00 +0000 UTC" firstStartedPulling="2026-03-18 16:24:00.9122624 +0000 UTC m=+2883.918436606" lastFinishedPulling="2026-03-18 16:24:01.914033666 +0000 UTC m=+2884.920207862" observedRunningTime="2026-03-18 16:24:02.271705422 +0000 UTC m=+2885.277879628" watchObservedRunningTime="2026-03-18 16:24:02.272039031 +0000 UTC m=+2885.278213237" Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.260337 4696 generic.go:334] "Generic (PLEG): container finished" podID="a4ba1a70-fee4-454d-9862-2f7a41fb5a5b" containerID="1dae12c06edce4d6cf4dba9ca25045a3acb8f0ca1e2511435d11a44bb750d8e6" exitCode=0 Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.260459 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-nwvrp" event={"ID":"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b","Type":"ContainerDied","Data":"1dae12c06edce4d6cf4dba9ca25045a3acb8f0ca1e2511435d11a44bb750d8e6"} Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.262220 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ntgbh" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="registry-server" containerID="cri-o://e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5" gracePeriod=2 Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.778408 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntgbh"
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.790820 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74mjf\" (UniqueName: \"kubernetes.io/projected/e589f63f-6e44-4efd-a82e-9fc7ccaba832-kube-api-access-74mjf\") pod \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") "
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.791044 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-utilities\") pod \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") "
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.791206 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-catalog-content\") pod \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\" (UID: \"e589f63f-6e44-4efd-a82e-9fc7ccaba832\") "
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.791734 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-utilities" (OuterVolumeSpecName: "utilities") pod "e589f63f-6e44-4efd-a82e-9fc7ccaba832" (UID: "e589f63f-6e44-4efd-a82e-9fc7ccaba832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.798229 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e589f63f-6e44-4efd-a82e-9fc7ccaba832-kube-api-access-74mjf" (OuterVolumeSpecName: "kube-api-access-74mjf") pod "e589f63f-6e44-4efd-a82e-9fc7ccaba832" (UID: "e589f63f-6e44-4efd-a82e-9fc7ccaba832"). InnerVolumeSpecName "kube-api-access-74mjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.858354 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e589f63f-6e44-4efd-a82e-9fc7ccaba832" (UID: "e589f63f-6e44-4efd-a82e-9fc7ccaba832"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.892898 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74mjf\" (UniqueName: \"kubernetes.io/projected/e589f63f-6e44-4efd-a82e-9fc7ccaba832-kube-api-access-74mjf\") on node \"crc\" DevicePath \"\""
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.893108 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:24:03 crc kubenswrapper[4696]: I0318 16:24:03.893168 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e589f63f-6e44-4efd-a82e-9fc7ccaba832-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.273716 4696 generic.go:334] "Generic (PLEG): container finished" podID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerID="e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5" exitCode=0
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.273768 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntgbh" event={"ID":"e589f63f-6e44-4efd-a82e-9fc7ccaba832","Type":"ContainerDied","Data":"e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5"}
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.273818 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntgbh" event={"ID":"e589f63f-6e44-4efd-a82e-9fc7ccaba832","Type":"ContainerDied","Data":"fcb67deaa0944b85449b3976cf921a5ebf6b7bcd58bc528ac1ceaad358e416a6"}
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.273816 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntgbh"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.273833 4696 scope.go:117] "RemoveContainer" containerID="e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.308550 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntgbh"]
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.310650 4696 scope.go:117] "RemoveContainer" containerID="d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.317500 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ntgbh"]
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.367717 4696 scope.go:117] "RemoveContainer" containerID="4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.401050 4696 scope.go:117] "RemoveContainer" containerID="e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5"
Mar 18 16:24:04 crc kubenswrapper[4696]: E0318 16:24:04.405751 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5\": container with ID starting with e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5 not found: ID does not exist" containerID="e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.405801 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5"} err="failed to get container status \"e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5\": rpc error: code = NotFound desc = could not find container \"e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5\": container with ID starting with e7de76314769aa7aa3db4c6b1e73ecd9f39a6d4d38cce0fb8f3f9cb2f7359db5 not found: ID does not exist"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.405829 4696 scope.go:117] "RemoveContainer" containerID="d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f"
Mar 18 16:24:04 crc kubenswrapper[4696]: E0318 16:24:04.406424 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f\": container with ID starting with d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f not found: ID does not exist" containerID="d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.406464 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f"} err="failed to get container status \"d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f\": rpc error: code = NotFound desc = could not find container \"d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f\": container with ID starting with d49b570e82f6f7591842ddd729bdc2a00461cdb52ae18926285dcc8d6501007f not found: ID does not exist"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.406493 4696 scope.go:117] "RemoveContainer" containerID="4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9"
Mar 18 16:24:04 crc kubenswrapper[4696]: E0318 16:24:04.408899 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9\": container with ID starting with 4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9 not found: ID does not exist" containerID="4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.408941 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9"} err="failed to get container status \"4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9\": rpc error: code = NotFound desc = could not find container \"4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9\": container with ID starting with 4596606bda07fe6cea0f3cd62376a10617996abe8abcdc6ba37b7cde4cde2fb9 not found: ID does not exist"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.687461 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-nwvrp"
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.708112 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8hwz\" (UniqueName: \"kubernetes.io/projected/a4ba1a70-fee4-454d-9862-2f7a41fb5a5b-kube-api-access-z8hwz\") pod \"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b\" (UID: \"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b\") "
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.723027 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ba1a70-fee4-454d-9862-2f7a41fb5a5b-kube-api-access-z8hwz" (OuterVolumeSpecName: "kube-api-access-z8hwz") pod "a4ba1a70-fee4-454d-9862-2f7a41fb5a5b" (UID: "a4ba1a70-fee4-454d-9862-2f7a41fb5a5b"). InnerVolumeSpecName "kube-api-access-z8hwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:24:04 crc kubenswrapper[4696]: I0318 16:24:04.810189 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8hwz\" (UniqueName: \"kubernetes.io/projected/a4ba1a70-fee4-454d-9862-2f7a41fb5a5b-kube-api-access-z8hwz\") on node \"crc\" DevicePath \"\""
Mar 18 16:24:05 crc kubenswrapper[4696]: I0318 16:24:05.283815 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-nwvrp" event={"ID":"a4ba1a70-fee4-454d-9862-2f7a41fb5a5b","Type":"ContainerDied","Data":"7fe9d79dbc50416a88c1a2f5c42f9ce8b4dc1a3e5497a57f5027af8db7c62d4d"}
Mar 18 16:24:05 crc kubenswrapper[4696]: I0318 16:24:05.283864 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fe9d79dbc50416a88c1a2f5c42f9ce8b4dc1a3e5497a57f5027af8db7c62d4d"
Mar 18 16:24:05 crc kubenswrapper[4696]: I0318 16:24:05.283858 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-nwvrp"
Mar 18 16:24:05 crc kubenswrapper[4696]: I0318 16:24:05.347755 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-jsvx7"]
Mar 18 16:24:05 crc kubenswrapper[4696]: I0318 16:24:05.355346 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-jsvx7"]
Mar 18 16:24:05 crc kubenswrapper[4696]: I0318 16:24:05.616280 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4d5b69-5589-4f08-a153-7e0ae2644a8b" path="/var/lib/kubelet/pods/9f4d5b69-5589-4f08-a153-7e0ae2644a8b/volumes"
Mar 18 16:24:05 crc kubenswrapper[4696]: I0318 16:24:05.618011 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" path="/var/lib/kubelet/pods/e589f63f-6e44-4efd-a82e-9fc7ccaba832/volumes"
Mar 18 16:24:23 crc kubenswrapper[4696]: I0318 16:24:23.743731 4696 scope.go:117] "RemoveContainer" containerID="6f94622a5a371de6fb44807053cce9daa30dafc1c62e0c05bb1150579fc27f44"
Mar 18 16:25:42 crc kubenswrapper[4696]: I0318 16:25:42.184333 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:25:42 crc kubenswrapper[4696]: I0318 16:25:42.184906 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.140371 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564186-wbbt8"]
Mar 18 16:26:00 crc kubenswrapper[4696]: E0318 16:26:00.141316 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ba1a70-fee4-454d-9862-2f7a41fb5a5b" containerName="oc"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.141329 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ba1a70-fee4-454d-9862-2f7a41fb5a5b" containerName="oc"
Mar 18 16:26:00 crc kubenswrapper[4696]: E0318 16:26:00.141340 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="extract-utilities"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.141346 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="extract-utilities"
Mar 18 16:26:00 crc kubenswrapper[4696]: E0318 16:26:00.141353 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="registry-server"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.141360 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="registry-server"
Mar 18 16:26:00 crc kubenswrapper[4696]: E0318 16:26:00.141384 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="extract-content"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.141390 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="extract-content"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.141660 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e589f63f-6e44-4efd-a82e-9fc7ccaba832" containerName="registry-server"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.141689 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ba1a70-fee4-454d-9862-2f7a41fb5a5b" containerName="oc"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.142309 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-wbbt8"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.146295 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.146725 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.146918 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.162810 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-wbbt8"]
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.258742 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzrpn\" (UniqueName: \"kubernetes.io/projected/62b6ea30-7269-4c46-90ed-a62cf6c99113-kube-api-access-vzrpn\") pod \"auto-csr-approver-29564186-wbbt8\" (UID: \"62b6ea30-7269-4c46-90ed-a62cf6c99113\") " pod="openshift-infra/auto-csr-approver-29564186-wbbt8"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.361844 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzrpn\" (UniqueName: \"kubernetes.io/projected/62b6ea30-7269-4c46-90ed-a62cf6c99113-kube-api-access-vzrpn\") pod \"auto-csr-approver-29564186-wbbt8\" (UID: \"62b6ea30-7269-4c46-90ed-a62cf6c99113\") " pod="openshift-infra/auto-csr-approver-29564186-wbbt8"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.382003 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzrpn\" (UniqueName: \"kubernetes.io/projected/62b6ea30-7269-4c46-90ed-a62cf6c99113-kube-api-access-vzrpn\") pod \"auto-csr-approver-29564186-wbbt8\" (UID: \"62b6ea30-7269-4c46-90ed-a62cf6c99113\") " pod="openshift-infra/auto-csr-approver-29564186-wbbt8"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.465152 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-wbbt8"
Mar 18 16:26:00 crc kubenswrapper[4696]: I0318 16:26:00.927663 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-wbbt8"]
Mar 18 16:26:01 crc kubenswrapper[4696]: I0318 16:26:01.646509 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-wbbt8" event={"ID":"62b6ea30-7269-4c46-90ed-a62cf6c99113","Type":"ContainerStarted","Data":"108004c0c54c0afee4b7cc7807ccec7c438270ab4adc95379db6834ca1e70ca8"}
Mar 18 16:26:02 crc kubenswrapper[4696]: I0318 16:26:02.658778 4696 generic.go:334] "Generic (PLEG): container finished" podID="62b6ea30-7269-4c46-90ed-a62cf6c99113" containerID="91e8d0fab6492e0c99653f16429f4d95802608e433e63e037c93c3791146cefa" exitCode=0
Mar 18 16:26:02 crc kubenswrapper[4696]: I0318 16:26:02.658828 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-wbbt8" event={"ID":"62b6ea30-7269-4c46-90ed-a62cf6c99113","Type":"ContainerDied","Data":"91e8d0fab6492e0c99653f16429f4d95802608e433e63e037c93c3791146cefa"}
Mar 18 16:26:04 crc kubenswrapper[4696]: I0318 16:26:04.014226 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-wbbt8"
Mar 18 16:26:04 crc kubenswrapper[4696]: I0318 16:26:04.165225 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzrpn\" (UniqueName: \"kubernetes.io/projected/62b6ea30-7269-4c46-90ed-a62cf6c99113-kube-api-access-vzrpn\") pod \"62b6ea30-7269-4c46-90ed-a62cf6c99113\" (UID: \"62b6ea30-7269-4c46-90ed-a62cf6c99113\") "
Mar 18 16:26:04 crc kubenswrapper[4696]: I0318 16:26:04.170596 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b6ea30-7269-4c46-90ed-a62cf6c99113-kube-api-access-vzrpn" (OuterVolumeSpecName: "kube-api-access-vzrpn") pod "62b6ea30-7269-4c46-90ed-a62cf6c99113" (UID: "62b6ea30-7269-4c46-90ed-a62cf6c99113"). InnerVolumeSpecName "kube-api-access-vzrpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:26:04 crc kubenswrapper[4696]: I0318 16:26:04.267642 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzrpn\" (UniqueName: \"kubernetes.io/projected/62b6ea30-7269-4c46-90ed-a62cf6c99113-kube-api-access-vzrpn\") on node \"crc\" DevicePath \"\""
Mar 18 16:26:04 crc kubenswrapper[4696]: I0318 16:26:04.677024 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-wbbt8" event={"ID":"62b6ea30-7269-4c46-90ed-a62cf6c99113","Type":"ContainerDied","Data":"108004c0c54c0afee4b7cc7807ccec7c438270ab4adc95379db6834ca1e70ca8"}
Mar 18 16:26:04 crc kubenswrapper[4696]: I0318 16:26:04.677694 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="108004c0c54c0afee4b7cc7807ccec7c438270ab4adc95379db6834ca1e70ca8"
Mar 18 16:26:04 crc kubenswrapper[4696]: I0318 16:26:04.677146 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-wbbt8"
Mar 18 16:26:05 crc kubenswrapper[4696]: I0318 16:26:05.084394 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-h964z"]
Mar 18 16:26:05 crc kubenswrapper[4696]: I0318 16:26:05.094275 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-h964z"]
Mar 18 16:26:05 crc kubenswrapper[4696]: I0318 16:26:05.609859 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33cc811-e3e0-4900-b5fd-579e91e48191" path="/var/lib/kubelet/pods/d33cc811-e3e0-4900-b5fd-579e91e48191/volumes"
Mar 18 16:26:12 crc kubenswrapper[4696]: I0318 16:26:12.184476 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:26:12 crc kubenswrapper[4696]: I0318 16:26:12.185101 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:26:23 crc kubenswrapper[4696]: I0318 16:26:23.889859 4696 scope.go:117] "RemoveContainer" containerID="c28386e8ee0ab605c159b0bce181c7962f6cfcefc6db939644831dde1693159b"
Mar 18 16:26:42 crc kubenswrapper[4696]: I0318 16:26:42.185089 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:26:42 crc kubenswrapper[4696]: I0318 16:26:42.185701 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:26:42 crc kubenswrapper[4696]: I0318 16:26:42.185756 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr"
Mar 18 16:26:42 crc kubenswrapper[4696]: I0318 16:26:42.186496 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 16:26:42 crc kubenswrapper[4696]: I0318 16:26:42.186578 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" gracePeriod=600
Mar 18 16:26:42 crc kubenswrapper[4696]: E0318 16:26:42.311540 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:26:43 crc kubenswrapper[4696]: I0318 16:26:43.025393 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" exitCode=0
Mar 18 16:26:43 crc kubenswrapper[4696]: I0318 16:26:43.025451 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"}
Mar 18 16:26:43 crc kubenswrapper[4696]: I0318 16:26:43.026043 4696 scope.go:117] "RemoveContainer" containerID="83e9cd26ce0fccbadcc5ae34fe4a8a285026485f53c72a41c064b2bd82e22018"
Mar 18 16:26:43 crc kubenswrapper[4696]: I0318 16:26:43.027057 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:26:43 crc kubenswrapper[4696]: E0318 16:26:43.027423 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:26:57 crc kubenswrapper[4696]: I0318 16:26:57.598126 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:26:57 crc kubenswrapper[4696]: E0318 16:26:57.598976 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:27:09 crc kubenswrapper[4696]: I0318 16:27:09.598664 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:27:09 crc kubenswrapper[4696]: E0318 16:27:09.599896 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:27:20 crc kubenswrapper[4696]: I0318 16:27:20.597505 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:27:20 crc kubenswrapper[4696]: E0318 16:27:20.598418 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:27:33 crc kubenswrapper[4696]: I0318 16:27:33.597985 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:27:33 crc kubenswrapper[4696]: E0318 16:27:33.598790 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:27:47 crc kubenswrapper[4696]: I0318 16:27:47.603988 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:27:47 crc kubenswrapper[4696]: E0318 16:27:47.604799 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.147554 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564188-2zs7b"]
Mar 18 16:28:00 crc kubenswrapper[4696]: E0318 16:28:00.148677 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b6ea30-7269-4c46-90ed-a62cf6c99113" containerName="oc"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.148695 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b6ea30-7269-4c46-90ed-a62cf6c99113" containerName="oc"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.148925 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b6ea30-7269-4c46-90ed-a62cf6c99113" containerName="oc"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.149749 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-2zs7b"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.153000 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.153199 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.153345 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.158484 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-2zs7b"]
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.300356 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z4f6\" (UniqueName: \"kubernetes.io/projected/674dbf40-fb2b-40a7-9019-ff7c8eac34c2-kube-api-access-5z4f6\") pod \"auto-csr-approver-29564188-2zs7b\" (UID: \"674dbf40-fb2b-40a7-9019-ff7c8eac34c2\") " pod="openshift-infra/auto-csr-approver-29564188-2zs7b"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.402318 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z4f6\" (UniqueName: \"kubernetes.io/projected/674dbf40-fb2b-40a7-9019-ff7c8eac34c2-kube-api-access-5z4f6\") pod \"auto-csr-approver-29564188-2zs7b\" (UID: \"674dbf40-fb2b-40a7-9019-ff7c8eac34c2\") " pod="openshift-infra/auto-csr-approver-29564188-2zs7b"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.435921 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z4f6\" (UniqueName: \"kubernetes.io/projected/674dbf40-fb2b-40a7-9019-ff7c8eac34c2-kube-api-access-5z4f6\") pod \"auto-csr-approver-29564188-2zs7b\" (UID: \"674dbf40-fb2b-40a7-9019-ff7c8eac34c2\") " pod="openshift-infra/auto-csr-approver-29564188-2zs7b"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.470382 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-2zs7b"
Mar 18 16:28:00 crc kubenswrapper[4696]: I0318 16:28:00.929985 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-2zs7b"]
Mar 18 16:28:01 crc kubenswrapper[4696]: I0318 16:28:01.598230 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:28:01 crc kubenswrapper[4696]: E0318 16:28:01.598746 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:28:01 crc kubenswrapper[4696]: I0318 16:28:01.934895 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-2zs7b" event={"ID":"674dbf40-fb2b-40a7-9019-ff7c8eac34c2","Type":"ContainerStarted","Data":"3400899ad4a1ec9194565a4c4ded7590fcc5801cc6698fd5a4359c2ef2494692"}
Mar 18 16:28:02 crc kubenswrapper[4696]: I0318 16:28:02.947684 4696 generic.go:334] "Generic (PLEG): container finished" podID="674dbf40-fb2b-40a7-9019-ff7c8eac34c2" containerID="a42f33bc9910bd5b726d6177ab62670f66b774347a9d6012f6b1e0687c0c2e46" exitCode=0
Mar 18 16:28:02 crc kubenswrapper[4696]: I0318 16:28:02.947771 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-2zs7b" event={"ID":"674dbf40-fb2b-40a7-9019-ff7c8eac34c2","Type":"ContainerDied","Data":"a42f33bc9910bd5b726d6177ab62670f66b774347a9d6012f6b1e0687c0c2e46"}
Mar 18 16:28:04 crc kubenswrapper[4696]: I0318 16:28:04.318011 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-2zs7b"
Mar 18 16:28:04 crc kubenswrapper[4696]: I0318 16:28:04.503496 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z4f6\" (UniqueName: \"kubernetes.io/projected/674dbf40-fb2b-40a7-9019-ff7c8eac34c2-kube-api-access-5z4f6\") pod \"674dbf40-fb2b-40a7-9019-ff7c8eac34c2\" (UID: \"674dbf40-fb2b-40a7-9019-ff7c8eac34c2\") "
Mar 18 16:28:04 crc kubenswrapper[4696]: I0318 16:28:04.509009 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674dbf40-fb2b-40a7-9019-ff7c8eac34c2-kube-api-access-5z4f6" (OuterVolumeSpecName: "kube-api-access-5z4f6") pod "674dbf40-fb2b-40a7-9019-ff7c8eac34c2" (UID: "674dbf40-fb2b-40a7-9019-ff7c8eac34c2"). InnerVolumeSpecName "kube-api-access-5z4f6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:28:04 crc kubenswrapper[4696]: I0318 16:28:04.606168 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z4f6\" (UniqueName: \"kubernetes.io/projected/674dbf40-fb2b-40a7-9019-ff7c8eac34c2-kube-api-access-5z4f6\") on node \"crc\" DevicePath \"\""
Mar 18 16:28:04 crc kubenswrapper[4696]: I0318 16:28:04.966401 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-2zs7b" event={"ID":"674dbf40-fb2b-40a7-9019-ff7c8eac34c2","Type":"ContainerDied","Data":"3400899ad4a1ec9194565a4c4ded7590fcc5801cc6698fd5a4359c2ef2494692"}
Mar 18 16:28:04 crc kubenswrapper[4696]: I0318 16:28:04.966736 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3400899ad4a1ec9194565a4c4ded7590fcc5801cc6698fd5a4359c2ef2494692"
Mar 18 16:28:04 crc kubenswrapper[4696]: I0318 16:28:04.966460 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-2zs7b"
Mar 18 16:28:05 crc kubenswrapper[4696]: I0318 16:28:05.381544 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-xzs5r"]
Mar 18 16:28:05 crc kubenswrapper[4696]: I0318 16:28:05.389872 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-xzs5r"]
Mar 18 16:28:05 crc kubenswrapper[4696]: I0318 16:28:05.610307 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b908965-da67-48dc-91df-0a792796b6c8" path="/var/lib/kubelet/pods/2b908965-da67-48dc-91df-0a792796b6c8/volumes"
Mar 18 16:28:12 crc kubenswrapper[4696]: I0318 16:28:12.598465 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:28:12 crc kubenswrapper[4696]: E0318 16:28:12.599294 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:28:24 crc kubenswrapper[4696]: I0318 16:28:24.037153 4696 scope.go:117] "RemoveContainer" containerID="48423fda3797c51d170bc3ac024a2a19407874b560c50505b5862c898048d226"
Mar 18 16:28:26 crc kubenswrapper[4696]: I0318 16:28:26.597946 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:28:26 crc kubenswrapper[4696]: E0318 16:28:26.598495 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:28:41 crc kubenswrapper[4696]: I0318 16:28:41.597225 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:28:41 crc kubenswrapper[4696]: E0318 16:28:41.598041 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:28:56 crc kubenswrapper[4696]: I0318 16:28:56.597982 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:28:56 crc kubenswrapper[4696]: E0318 16:28:56.599131 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:29:07 crc kubenswrapper[4696]: I0318 16:29:07.609386 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:29:07 crc kubenswrapper[4696]: E0318 16:29:07.610453 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:29:22 crc kubenswrapper[4696]: I0318 16:29:22.597683 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:29:22 crc kubenswrapper[4696]: E0318 16:29:22.599205 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:29:37 crc kubenswrapper[4696]: I0318 16:29:37.597321 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:29:37 crc kubenswrapper[4696]: E0318 16:29:37.598125 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:29:51 crc kubenswrapper[4696]: I0318 16:29:51.597464 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17"
Mar 18 16:29:51 crc kubenswrapper[4696]: E0318 16:29:51.598252 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s
restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.182374 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564190-jfx4w"] Mar 18 16:30:00 crc kubenswrapper[4696]: E0318 16:30:00.184016 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674dbf40-fb2b-40a7-9019-ff7c8eac34c2" containerName="oc" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.184045 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="674dbf40-fb2b-40a7-9019-ff7c8eac34c2" containerName="oc" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.186094 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="674dbf40-fb2b-40a7-9019-ff7c8eac34c2" containerName="oc" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.187254 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.189682 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.189883 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.190052 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.193310 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8"] Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.195365 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.198328 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.198397 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.201554 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8"] Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.210628 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-jfx4w"] Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.279386 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/eae3e26f-0313-4b3f-b260-4b665073e29c-config-volume\") pod \"collect-profiles-29564190-cx2s8\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.279475 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkhsk\" (UniqueName: \"kubernetes.io/projected/e065bd04-e1af-434e-bc60-2c45b41de3d4-kube-api-access-tkhsk\") pod \"auto-csr-approver-29564190-jfx4w\" (UID: \"e065bd04-e1af-434e-bc60-2c45b41de3d4\") " pod="openshift-infra/auto-csr-approver-29564190-jfx4w" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.279601 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eae3e26f-0313-4b3f-b260-4b665073e29c-secret-volume\") pod \"collect-profiles-29564190-cx2s8\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.279719 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hcrx\" (UniqueName: \"kubernetes.io/projected/eae3e26f-0313-4b3f-b260-4b665073e29c-kube-api-access-8hcrx\") pod \"collect-profiles-29564190-cx2s8\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.381633 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhsk\" (UniqueName: \"kubernetes.io/projected/e065bd04-e1af-434e-bc60-2c45b41de3d4-kube-api-access-tkhsk\") pod \"auto-csr-approver-29564190-jfx4w\" (UID: \"e065bd04-e1af-434e-bc60-2c45b41de3d4\") " pod="openshift-infra/auto-csr-approver-29564190-jfx4w" Mar 18 
16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.382430 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eae3e26f-0313-4b3f-b260-4b665073e29c-secret-volume\") pod \"collect-profiles-29564190-cx2s8\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.382623 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hcrx\" (UniqueName: \"kubernetes.io/projected/eae3e26f-0313-4b3f-b260-4b665073e29c-kube-api-access-8hcrx\") pod \"collect-profiles-29564190-cx2s8\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.382774 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eae3e26f-0313-4b3f-b260-4b665073e29c-config-volume\") pod \"collect-profiles-29564190-cx2s8\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.384513 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eae3e26f-0313-4b3f-b260-4b665073e29c-config-volume\") pod \"collect-profiles-29564190-cx2s8\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.392644 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eae3e26f-0313-4b3f-b260-4b665073e29c-secret-volume\") pod \"collect-profiles-29564190-cx2s8\" (UID: 
\"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.399542 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhsk\" (UniqueName: \"kubernetes.io/projected/e065bd04-e1af-434e-bc60-2c45b41de3d4-kube-api-access-tkhsk\") pod \"auto-csr-approver-29564190-jfx4w\" (UID: \"e065bd04-e1af-434e-bc60-2c45b41de3d4\") " pod="openshift-infra/auto-csr-approver-29564190-jfx4w" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.400789 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hcrx\" (UniqueName: \"kubernetes.io/projected/eae3e26f-0313-4b3f-b260-4b665073e29c-kube-api-access-8hcrx\") pod \"collect-profiles-29564190-cx2s8\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.509122 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.523043 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.980994 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-jfx4w"] Mar 18 16:30:00 crc kubenswrapper[4696]: I0318 16:30:00.987286 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:30:01 crc kubenswrapper[4696]: I0318 16:30:01.083131 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" event={"ID":"e065bd04-e1af-434e-bc60-2c45b41de3d4","Type":"ContainerStarted","Data":"ddbaec6ee27209e701e83d4fa896a72cf57aba3a95b663ca3662bd950f9a39fc"} Mar 18 16:30:01 crc kubenswrapper[4696]: I0318 16:30:01.085201 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8"] Mar 18 16:30:01 crc kubenswrapper[4696]: W0318 16:30:01.088836 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeae3e26f_0313_4b3f_b260_4b665073e29c.slice/crio-d3616e0b41b701e32bbe4dbd5ec42c31d3df238872183e7b4b3ac2b56bc1c044 WatchSource:0}: Error finding container d3616e0b41b701e32bbe4dbd5ec42c31d3df238872183e7b4b3ac2b56bc1c044: Status 404 returned error can't find the container with id d3616e0b41b701e32bbe4dbd5ec42c31d3df238872183e7b4b3ac2b56bc1c044 Mar 18 16:30:02 crc kubenswrapper[4696]: I0318 16:30:02.096930 4696 generic.go:334] "Generic (PLEG): container finished" podID="eae3e26f-0313-4b3f-b260-4b665073e29c" containerID="bc06faec1cb1ea5bcf91033cb7d31a548cfc590af1bdb0f477e76a13f81ff6ec" exitCode=0 Mar 18 16:30:02 crc kubenswrapper[4696]: I0318 16:30:02.097004 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" 
event={"ID":"eae3e26f-0313-4b3f-b260-4b665073e29c","Type":"ContainerDied","Data":"bc06faec1cb1ea5bcf91033cb7d31a548cfc590af1bdb0f477e76a13f81ff6ec"} Mar 18 16:30:02 crc kubenswrapper[4696]: I0318 16:30:02.098034 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" event={"ID":"eae3e26f-0313-4b3f-b260-4b665073e29c","Type":"ContainerStarted","Data":"d3616e0b41b701e32bbe4dbd5ec42c31d3df238872183e7b4b3ac2b56bc1c044"} Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.106277 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" event={"ID":"e065bd04-e1af-434e-bc60-2c45b41de3d4","Type":"ContainerStarted","Data":"abddb6636fdf0314ab2ed4753401581e6c0d02f041c33fd4f21bb9a61c7ae616"} Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.597771 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:30:03 crc kubenswrapper[4696]: E0318 16:30:03.598399 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.678876 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.699987 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" podStartSLOduration=2.007838751 podStartE2EDuration="3.699966553s" podCreationTimestamp="2026-03-18 16:30:00 +0000 UTC" firstStartedPulling="2026-03-18 16:30:00.987079732 +0000 UTC m=+3243.993253938" lastFinishedPulling="2026-03-18 16:30:02.679207524 +0000 UTC m=+3245.685381740" observedRunningTime="2026-03-18 16:30:03.144043589 +0000 UTC m=+3246.150217805" watchObservedRunningTime="2026-03-18 16:30:03.699966553 +0000 UTC m=+3246.706140759" Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.755840 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eae3e26f-0313-4b3f-b260-4b665073e29c-secret-volume\") pod \"eae3e26f-0313-4b3f-b260-4b665073e29c\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.755939 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eae3e26f-0313-4b3f-b260-4b665073e29c-config-volume\") pod \"eae3e26f-0313-4b3f-b260-4b665073e29c\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.756153 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hcrx\" (UniqueName: \"kubernetes.io/projected/eae3e26f-0313-4b3f-b260-4b665073e29c-kube-api-access-8hcrx\") pod \"eae3e26f-0313-4b3f-b260-4b665073e29c\" (UID: \"eae3e26f-0313-4b3f-b260-4b665073e29c\") " Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.756858 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/eae3e26f-0313-4b3f-b260-4b665073e29c-config-volume" (OuterVolumeSpecName: "config-volume") pod "eae3e26f-0313-4b3f-b260-4b665073e29c" (UID: "eae3e26f-0313-4b3f-b260-4b665073e29c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.762959 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae3e26f-0313-4b3f-b260-4b665073e29c-kube-api-access-8hcrx" (OuterVolumeSpecName: "kube-api-access-8hcrx") pod "eae3e26f-0313-4b3f-b260-4b665073e29c" (UID: "eae3e26f-0313-4b3f-b260-4b665073e29c"). InnerVolumeSpecName "kube-api-access-8hcrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.762981 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae3e26f-0313-4b3f-b260-4b665073e29c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eae3e26f-0313-4b3f-b260-4b665073e29c" (UID: "eae3e26f-0313-4b3f-b260-4b665073e29c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.858857 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hcrx\" (UniqueName: \"kubernetes.io/projected/eae3e26f-0313-4b3f-b260-4b665073e29c-kube-api-access-8hcrx\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.858918 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eae3e26f-0313-4b3f-b260-4b665073e29c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:03 crc kubenswrapper[4696]: I0318 16:30:03.858930 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eae3e26f-0313-4b3f-b260-4b665073e29c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:04 crc kubenswrapper[4696]: I0318 16:30:04.115855 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" event={"ID":"eae3e26f-0313-4b3f-b260-4b665073e29c","Type":"ContainerDied","Data":"d3616e0b41b701e32bbe4dbd5ec42c31d3df238872183e7b4b3ac2b56bc1c044"} Mar 18 16:30:04 crc kubenswrapper[4696]: I0318 16:30:04.115925 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3616e0b41b701e32bbe4dbd5ec42c31d3df238872183e7b4b3ac2b56bc1c044" Mar 18 16:30:04 crc kubenswrapper[4696]: I0318 16:30:04.115876 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-cx2s8" Mar 18 16:30:04 crc kubenswrapper[4696]: I0318 16:30:04.117473 4696 generic.go:334] "Generic (PLEG): container finished" podID="e065bd04-e1af-434e-bc60-2c45b41de3d4" containerID="abddb6636fdf0314ab2ed4753401581e6c0d02f041c33fd4f21bb9a61c7ae616" exitCode=0 Mar 18 16:30:04 crc kubenswrapper[4696]: I0318 16:30:04.117501 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" event={"ID":"e065bd04-e1af-434e-bc60-2c45b41de3d4","Type":"ContainerDied","Data":"abddb6636fdf0314ab2ed4753401581e6c0d02f041c33fd4f21bb9a61c7ae616"} Mar 18 16:30:04 crc kubenswrapper[4696]: I0318 16:30:04.744580 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt"] Mar 18 16:30:04 crc kubenswrapper[4696]: I0318 16:30:04.755037 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-6jvbt"] Mar 18 16:30:05 crc kubenswrapper[4696]: I0318 16:30:05.476579 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" Mar 18 16:30:05 crc kubenswrapper[4696]: I0318 16:30:05.490895 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkhsk\" (UniqueName: \"kubernetes.io/projected/e065bd04-e1af-434e-bc60-2c45b41de3d4-kube-api-access-tkhsk\") pod \"e065bd04-e1af-434e-bc60-2c45b41de3d4\" (UID: \"e065bd04-e1af-434e-bc60-2c45b41de3d4\") " Mar 18 16:30:05 crc kubenswrapper[4696]: I0318 16:30:05.495383 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e065bd04-e1af-434e-bc60-2c45b41de3d4-kube-api-access-tkhsk" (OuterVolumeSpecName: "kube-api-access-tkhsk") pod "e065bd04-e1af-434e-bc60-2c45b41de3d4" (UID: "e065bd04-e1af-434e-bc60-2c45b41de3d4"). 
InnerVolumeSpecName "kube-api-access-tkhsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:30:05 crc kubenswrapper[4696]: I0318 16:30:05.593051 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkhsk\" (UniqueName: \"kubernetes.io/projected/e065bd04-e1af-434e-bc60-2c45b41de3d4-kube-api-access-tkhsk\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:05 crc kubenswrapper[4696]: I0318 16:30:05.610936 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fd2f649-6dd8-413f-85fe-36cd6e6cea88" path="/var/lib/kubelet/pods/5fd2f649-6dd8-413f-85fe-36cd6e6cea88/volumes" Mar 18 16:30:06 crc kubenswrapper[4696]: I0318 16:30:06.137092 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" event={"ID":"e065bd04-e1af-434e-bc60-2c45b41de3d4","Type":"ContainerDied","Data":"ddbaec6ee27209e701e83d4fa896a72cf57aba3a95b663ca3662bd950f9a39fc"} Mar 18 16:30:06 crc kubenswrapper[4696]: I0318 16:30:06.137138 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddbaec6ee27209e701e83d4fa896a72cf57aba3a95b663ca3662bd950f9a39fc" Mar 18 16:30:06 crc kubenswrapper[4696]: I0318 16:30:06.137165 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-jfx4w" Mar 18 16:30:06 crc kubenswrapper[4696]: I0318 16:30:06.535019 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-nwvrp"] Mar 18 16:30:06 crc kubenswrapper[4696]: I0318 16:30:06.542588 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-nwvrp"] Mar 18 16:30:07 crc kubenswrapper[4696]: I0318 16:30:07.610265 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ba1a70-fee4-454d-9862-2f7a41fb5a5b" path="/var/lib/kubelet/pods/a4ba1a70-fee4-454d-9862-2f7a41fb5a5b/volumes" Mar 18 16:30:16 crc kubenswrapper[4696]: I0318 16:30:16.596910 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:30:16 crc kubenswrapper[4696]: E0318 16:30:16.597509 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:30:24 crc kubenswrapper[4696]: I0318 16:30:24.160936 4696 scope.go:117] "RemoveContainer" containerID="1dae12c06edce4d6cf4dba9ca25045a3acb8f0ca1e2511435d11a44bb750d8e6" Mar 18 16:30:24 crc kubenswrapper[4696]: I0318 16:30:24.220425 4696 scope.go:117] "RemoveContainer" containerID="37da1252302ec3ef393f23519f05d6f604ae011e69b569458030fa3e62f7d98b" Mar 18 16:30:28 crc kubenswrapper[4696]: I0318 16:30:28.597383 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:30:28 crc kubenswrapper[4696]: E0318 16:30:28.598241 4696 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.757449 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c25v7"] Mar 18 16:30:35 crc kubenswrapper[4696]: E0318 16:30:35.758505 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e065bd04-e1af-434e-bc60-2c45b41de3d4" containerName="oc" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.758569 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e065bd04-e1af-434e-bc60-2c45b41de3d4" containerName="oc" Mar 18 16:30:35 crc kubenswrapper[4696]: E0318 16:30:35.758604 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae3e26f-0313-4b3f-b260-4b665073e29c" containerName="collect-profiles" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.758613 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae3e26f-0313-4b3f-b260-4b665073e29c" containerName="collect-profiles" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.758823 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e065bd04-e1af-434e-bc60-2c45b41de3d4" containerName="oc" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.758842 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae3e26f-0313-4b3f-b260-4b665073e29c" containerName="collect-profiles" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.761067 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.789220 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c25v7"] Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.942682 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddcmm\" (UniqueName: \"kubernetes.io/projected/0190bcdc-e3c8-483b-a47b-fb38128aa80f-kube-api-access-ddcmm\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.942756 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-utilities\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:35 crc kubenswrapper[4696]: I0318 16:30:35.942839 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-catalog-content\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:36 crc kubenswrapper[4696]: I0318 16:30:36.045441 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-utilities\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:36 crc kubenswrapper[4696]: I0318 16:30:36.045559 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-catalog-content\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:36 crc kubenswrapper[4696]: I0318 16:30:36.045723 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddcmm\" (UniqueName: \"kubernetes.io/projected/0190bcdc-e3c8-483b-a47b-fb38128aa80f-kube-api-access-ddcmm\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:36 crc kubenswrapper[4696]: I0318 16:30:36.046094 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-utilities\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:36 crc kubenswrapper[4696]: I0318 16:30:36.046478 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-catalog-content\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:36 crc kubenswrapper[4696]: I0318 16:30:36.067091 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddcmm\" (UniqueName: \"kubernetes.io/projected/0190bcdc-e3c8-483b-a47b-fb38128aa80f-kube-api-access-ddcmm\") pod \"redhat-marketplace-c25v7\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:36 crc kubenswrapper[4696]: I0318 16:30:36.086166 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:36 crc kubenswrapper[4696]: I0318 16:30:36.551192 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c25v7"] Mar 18 16:30:37 crc kubenswrapper[4696]: I0318 16:30:37.431816 4696 generic.go:334] "Generic (PLEG): container finished" podID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerID="826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e" exitCode=0 Mar 18 16:30:37 crc kubenswrapper[4696]: I0318 16:30:37.431961 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c25v7" event={"ID":"0190bcdc-e3c8-483b-a47b-fb38128aa80f","Type":"ContainerDied","Data":"826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e"} Mar 18 16:30:37 crc kubenswrapper[4696]: I0318 16:30:37.432075 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c25v7" event={"ID":"0190bcdc-e3c8-483b-a47b-fb38128aa80f","Type":"ContainerStarted","Data":"2c59c780be0db38835c36fd241404d7584e8871e326b6d2688bb3109f5d5d0cb"} Mar 18 16:30:38 crc kubenswrapper[4696]: I0318 16:30:38.444578 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c25v7" event={"ID":"0190bcdc-e3c8-483b-a47b-fb38128aa80f","Type":"ContainerStarted","Data":"420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b"} Mar 18 16:30:39 crc kubenswrapper[4696]: I0318 16:30:39.454880 4696 generic.go:334] "Generic (PLEG): container finished" podID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerID="420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b" exitCode=0 Mar 18 16:30:39 crc kubenswrapper[4696]: I0318 16:30:39.454921 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c25v7" 
event={"ID":"0190bcdc-e3c8-483b-a47b-fb38128aa80f","Type":"ContainerDied","Data":"420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b"} Mar 18 16:30:40 crc kubenswrapper[4696]: I0318 16:30:40.465662 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c25v7" event={"ID":"0190bcdc-e3c8-483b-a47b-fb38128aa80f","Type":"ContainerStarted","Data":"760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae"} Mar 18 16:30:40 crc kubenswrapper[4696]: I0318 16:30:40.488305 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c25v7" podStartSLOduration=2.986402809 podStartE2EDuration="5.488281581s" podCreationTimestamp="2026-03-18 16:30:35 +0000 UTC" firstStartedPulling="2026-03-18 16:30:37.434435295 +0000 UTC m=+3280.440609501" lastFinishedPulling="2026-03-18 16:30:39.936314067 +0000 UTC m=+3282.942488273" observedRunningTime="2026-03-18 16:30:40.486793273 +0000 UTC m=+3283.492967489" watchObservedRunningTime="2026-03-18 16:30:40.488281581 +0000 UTC m=+3283.494455777" Mar 18 16:30:40 crc kubenswrapper[4696]: I0318 16:30:40.597561 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:30:40 crc kubenswrapper[4696]: E0318 16:30:40.597904 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:30:46 crc kubenswrapper[4696]: I0318 16:30:46.086891 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:46 crc 
kubenswrapper[4696]: I0318 16:30:46.087384 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:46 crc kubenswrapper[4696]: I0318 16:30:46.134236 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:46 crc kubenswrapper[4696]: I0318 16:30:46.601441 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:46 crc kubenswrapper[4696]: I0318 16:30:46.660298 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c25v7"] Mar 18 16:30:48 crc kubenswrapper[4696]: I0318 16:30:48.539159 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c25v7" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerName="registry-server" containerID="cri-o://760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae" gracePeriod=2 Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.027725 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.209118 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddcmm\" (UniqueName: \"kubernetes.io/projected/0190bcdc-e3c8-483b-a47b-fb38128aa80f-kube-api-access-ddcmm\") pod \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.209763 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-catalog-content\") pod \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.210099 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-utilities\") pod \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\" (UID: \"0190bcdc-e3c8-483b-a47b-fb38128aa80f\") " Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.211131 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-utilities" (OuterVolumeSpecName: "utilities") pod "0190bcdc-e3c8-483b-a47b-fb38128aa80f" (UID: "0190bcdc-e3c8-483b-a47b-fb38128aa80f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.214609 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0190bcdc-e3c8-483b-a47b-fb38128aa80f-kube-api-access-ddcmm" (OuterVolumeSpecName: "kube-api-access-ddcmm") pod "0190bcdc-e3c8-483b-a47b-fb38128aa80f" (UID: "0190bcdc-e3c8-483b-a47b-fb38128aa80f"). InnerVolumeSpecName "kube-api-access-ddcmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.297897 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0190bcdc-e3c8-483b-a47b-fb38128aa80f" (UID: "0190bcdc-e3c8-483b-a47b-fb38128aa80f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.317512 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.317692 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0190bcdc-e3c8-483b-a47b-fb38128aa80f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.317761 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddcmm\" (UniqueName: \"kubernetes.io/projected/0190bcdc-e3c8-483b-a47b-fb38128aa80f-kube-api-access-ddcmm\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.552569 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c25v7" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.552673 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c25v7" event={"ID":"0190bcdc-e3c8-483b-a47b-fb38128aa80f","Type":"ContainerDied","Data":"760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae"} Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.552962 4696 scope.go:117] "RemoveContainer" containerID="760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.552510 4696 generic.go:334] "Generic (PLEG): container finished" podID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerID="760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae" exitCode=0 Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.553038 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c25v7" event={"ID":"0190bcdc-e3c8-483b-a47b-fb38128aa80f","Type":"ContainerDied","Data":"2c59c780be0db38835c36fd241404d7584e8871e326b6d2688bb3109f5d5d0cb"} Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.579389 4696 scope.go:117] "RemoveContainer" containerID="420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.609900 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c25v7"] Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.609946 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c25v7"] Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.610861 4696 scope.go:117] "RemoveContainer" containerID="826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.654910 4696 scope.go:117] "RemoveContainer" 
containerID="760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae" Mar 18 16:30:49 crc kubenswrapper[4696]: E0318 16:30:49.655428 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae\": container with ID starting with 760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae not found: ID does not exist" containerID="760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.655468 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae"} err="failed to get container status \"760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae\": rpc error: code = NotFound desc = could not find container \"760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae\": container with ID starting with 760ecabcfd6e6dbab2f0943c6e9f4735755ab4e260cf7a5af44a6db3bf8f00ae not found: ID does not exist" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.655494 4696 scope.go:117] "RemoveContainer" containerID="420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b" Mar 18 16:30:49 crc kubenswrapper[4696]: E0318 16:30:49.656045 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b\": container with ID starting with 420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b not found: ID does not exist" containerID="420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.656098 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b"} err="failed to get container status \"420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b\": rpc error: code = NotFound desc = could not find container \"420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b\": container with ID starting with 420d282090e4f8feac8f55f1cefabb5cb9e598622b854650bcb4957ee563534b not found: ID does not exist" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.656140 4696 scope.go:117] "RemoveContainer" containerID="826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e" Mar 18 16:30:49 crc kubenswrapper[4696]: E0318 16:30:49.656745 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e\": container with ID starting with 826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e not found: ID does not exist" containerID="826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e" Mar 18 16:30:49 crc kubenswrapper[4696]: I0318 16:30:49.656796 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e"} err="failed to get container status \"826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e\": rpc error: code = NotFound desc = could not find container \"826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e\": container with ID starting with 826ac7aa5aa6a7a81cb85042e0deba9a6a332c0a167c6a80430de2c0fdf4220e not found: ID does not exist" Mar 18 16:30:51 crc kubenswrapper[4696]: I0318 16:30:51.609998 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" path="/var/lib/kubelet/pods/0190bcdc-e3c8-483b-a47b-fb38128aa80f/volumes" Mar 18 16:30:55 crc kubenswrapper[4696]: I0318 
16:30:55.597483 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:30:55 crc kubenswrapper[4696]: E0318 16:30:55.598373 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:31:08 crc kubenswrapper[4696]: I0318 16:31:08.597227 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:31:08 crc kubenswrapper[4696]: E0318 16:31:08.598012 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:31:22 crc kubenswrapper[4696]: I0318 16:31:22.597460 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:31:22 crc kubenswrapper[4696]: E0318 16:31:22.598857 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:31:33 crc 
kubenswrapper[4696]: I0318 16:31:33.597959 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:31:33 crc kubenswrapper[4696]: E0318 16:31:33.598682 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:31:46 crc kubenswrapper[4696]: I0318 16:31:46.597069 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:31:47 crc kubenswrapper[4696]: I0318 16:31:47.080138 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"cf86dadce3d1fb257b7fe8784bc21654ea04954f310cfd6357d78c465ecfc986"} Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.139067 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564192-25g4q"] Mar 18 16:32:00 crc kubenswrapper[4696]: E0318 16:32:00.140271 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerName="registry-server" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.140298 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerName="registry-server" Mar 18 16:32:00 crc kubenswrapper[4696]: E0318 16:32:00.140326 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerName="extract-content" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 
16:32:00.140338 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerName="extract-content" Mar 18 16:32:00 crc kubenswrapper[4696]: E0318 16:32:00.140375 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerName="extract-utilities" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.140388 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerName="extract-utilities" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.140761 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0190bcdc-e3c8-483b-a47b-fb38128aa80f" containerName="registry-server" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.141836 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-25g4q" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.144027 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.144122 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.144215 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.149972 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-25g4q"] Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.297825 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg587\" (UniqueName: \"kubernetes.io/projected/e17e10e7-6226-40f5-a445-4d73fb676335-kube-api-access-kg587\") pod \"auto-csr-approver-29564192-25g4q\" (UID: 
\"e17e10e7-6226-40f5-a445-4d73fb676335\") " pod="openshift-infra/auto-csr-approver-29564192-25g4q" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.399920 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg587\" (UniqueName: \"kubernetes.io/projected/e17e10e7-6226-40f5-a445-4d73fb676335-kube-api-access-kg587\") pod \"auto-csr-approver-29564192-25g4q\" (UID: \"e17e10e7-6226-40f5-a445-4d73fb676335\") " pod="openshift-infra/auto-csr-approver-29564192-25g4q" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.427291 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg587\" (UniqueName: \"kubernetes.io/projected/e17e10e7-6226-40f5-a445-4d73fb676335-kube-api-access-kg587\") pod \"auto-csr-approver-29564192-25g4q\" (UID: \"e17e10e7-6226-40f5-a445-4d73fb676335\") " pod="openshift-infra/auto-csr-approver-29564192-25g4q" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.463716 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-25g4q" Mar 18 16:32:00 crc kubenswrapper[4696]: I0318 16:32:00.884881 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-25g4q"] Mar 18 16:32:01 crc kubenswrapper[4696]: I0318 16:32:01.194691 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-25g4q" event={"ID":"e17e10e7-6226-40f5-a445-4d73fb676335","Type":"ContainerStarted","Data":"4680a185938e508b48281e831558f7c2232bcb9e86dd0e56a097a3ad24da61af"} Mar 18 16:32:03 crc kubenswrapper[4696]: I0318 16:32:03.216497 4696 generic.go:334] "Generic (PLEG): container finished" podID="e17e10e7-6226-40f5-a445-4d73fb676335" containerID="2005ef9ef26f3cd38fa0714d0b9a8ef267ccb57ebc1cb89d0b7634de8e177985" exitCode=0 Mar 18 16:32:03 crc kubenswrapper[4696]: I0318 16:32:03.216569 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-25g4q" event={"ID":"e17e10e7-6226-40f5-a445-4d73fb676335","Type":"ContainerDied","Data":"2005ef9ef26f3cd38fa0714d0b9a8ef267ccb57ebc1cb89d0b7634de8e177985"} Mar 18 16:32:04 crc kubenswrapper[4696]: I0318 16:32:04.627805 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-25g4q" Mar 18 16:32:04 crc kubenswrapper[4696]: I0318 16:32:04.782885 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg587\" (UniqueName: \"kubernetes.io/projected/e17e10e7-6226-40f5-a445-4d73fb676335-kube-api-access-kg587\") pod \"e17e10e7-6226-40f5-a445-4d73fb676335\" (UID: \"e17e10e7-6226-40f5-a445-4d73fb676335\") " Mar 18 16:32:04 crc kubenswrapper[4696]: I0318 16:32:04.804040 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17e10e7-6226-40f5-a445-4d73fb676335-kube-api-access-kg587" (OuterVolumeSpecName: "kube-api-access-kg587") pod "e17e10e7-6226-40f5-a445-4d73fb676335" (UID: "e17e10e7-6226-40f5-a445-4d73fb676335"). InnerVolumeSpecName "kube-api-access-kg587". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:32:04 crc kubenswrapper[4696]: I0318 16:32:04.885369 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg587\" (UniqueName: \"kubernetes.io/projected/e17e10e7-6226-40f5-a445-4d73fb676335-kube-api-access-kg587\") on node \"crc\" DevicePath \"\"" Mar 18 16:32:05 crc kubenswrapper[4696]: I0318 16:32:05.234504 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-25g4q" event={"ID":"e17e10e7-6226-40f5-a445-4d73fb676335","Type":"ContainerDied","Data":"4680a185938e508b48281e831558f7c2232bcb9e86dd0e56a097a3ad24da61af"} Mar 18 16:32:05 crc kubenswrapper[4696]: I0318 16:32:05.234583 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4680a185938e508b48281e831558f7c2232bcb9e86dd0e56a097a3ad24da61af" Mar 18 16:32:05 crc kubenswrapper[4696]: I0318 16:32:05.234664 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-25g4q" Mar 18 16:32:05 crc kubenswrapper[4696]: I0318 16:32:05.699735 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-wbbt8"] Mar 18 16:32:05 crc kubenswrapper[4696]: I0318 16:32:05.707361 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-wbbt8"] Mar 18 16:32:07 crc kubenswrapper[4696]: I0318 16:32:07.611766 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b6ea30-7269-4c46-90ed-a62cf6c99113" path="/var/lib/kubelet/pods/62b6ea30-7269-4c46-90ed-a62cf6c99113/volumes" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.772251 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sb4zx"] Mar 18 16:32:18 crc kubenswrapper[4696]: E0318 16:32:18.773339 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17e10e7-6226-40f5-a445-4d73fb676335" containerName="oc" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.773356 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17e10e7-6226-40f5-a445-4d73fb676335" containerName="oc" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.773681 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17e10e7-6226-40f5-a445-4d73fb676335" containerName="oc" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.775892 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.782771 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sb4zx"] Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.792260 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b73d2de-d188-4d58-84da-b09309189aa1-catalog-content\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.792541 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx8ps\" (UniqueName: \"kubernetes.io/projected/7b73d2de-d188-4d58-84da-b09309189aa1-kube-api-access-qx8ps\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.792629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b73d2de-d188-4d58-84da-b09309189aa1-utilities\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.893866 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b73d2de-d188-4d58-84da-b09309189aa1-utilities\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.894062 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b73d2de-d188-4d58-84da-b09309189aa1-catalog-content\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.894596 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b73d2de-d188-4d58-84da-b09309189aa1-utilities\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.894824 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx8ps\" (UniqueName: \"kubernetes.io/projected/7b73d2de-d188-4d58-84da-b09309189aa1-kube-api-access-qx8ps\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.894902 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b73d2de-d188-4d58-84da-b09309189aa1-catalog-content\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:18 crc kubenswrapper[4696]: I0318 16:32:18.924374 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx8ps\" (UniqueName: \"kubernetes.io/projected/7b73d2de-d188-4d58-84da-b09309189aa1-kube-api-access-qx8ps\") pod \"redhat-operators-sb4zx\" (UID: \"7b73d2de-d188-4d58-84da-b09309189aa1\") " pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:19 crc kubenswrapper[4696]: I0318 16:32:19.105839 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:19 crc kubenswrapper[4696]: I0318 16:32:19.566804 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sb4zx"] Mar 18 16:32:20 crc kubenswrapper[4696]: I0318 16:32:20.369004 4696 generic.go:334] "Generic (PLEG): container finished" podID="7b73d2de-d188-4d58-84da-b09309189aa1" containerID="9d717a8d66690bebf287afb967cff197820b3740040f888a5636107e02a44519" exitCode=0 Mar 18 16:32:20 crc kubenswrapper[4696]: I0318 16:32:20.369124 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb4zx" event={"ID":"7b73d2de-d188-4d58-84da-b09309189aa1","Type":"ContainerDied","Data":"9d717a8d66690bebf287afb967cff197820b3740040f888a5636107e02a44519"} Mar 18 16:32:20 crc kubenswrapper[4696]: I0318 16:32:20.369354 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb4zx" event={"ID":"7b73d2de-d188-4d58-84da-b09309189aa1","Type":"ContainerStarted","Data":"ad667eb087c8b54a0a649aefd7f205ca3ca083dd0148f8df7dcb91ebeb1e33ee"} Mar 18 16:32:24 crc kubenswrapper[4696]: I0318 16:32:24.381217 4696 scope.go:117] "RemoveContainer" containerID="91e8d0fab6492e0c99653f16429f4d95802608e433e63e037c93c3791146cefa" Mar 18 16:32:30 crc kubenswrapper[4696]: I0318 16:32:30.472270 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb4zx" event={"ID":"7b73d2de-d188-4d58-84da-b09309189aa1","Type":"ContainerStarted","Data":"82f905679f851d3edac215bf1d6fffac898f501def15fb5fa19d6b0348b56924"} Mar 18 16:32:31 crc kubenswrapper[4696]: I0318 16:32:31.484715 4696 generic.go:334] "Generic (PLEG): container finished" podID="7b73d2de-d188-4d58-84da-b09309189aa1" containerID="82f905679f851d3edac215bf1d6fffac898f501def15fb5fa19d6b0348b56924" exitCode=0 Mar 18 16:32:31 crc kubenswrapper[4696]: I0318 16:32:31.485091 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-sb4zx" event={"ID":"7b73d2de-d188-4d58-84da-b09309189aa1","Type":"ContainerDied","Data":"82f905679f851d3edac215bf1d6fffac898f501def15fb5fa19d6b0348b56924"} Mar 18 16:32:42 crc kubenswrapper[4696]: I0318 16:32:42.902068 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="e2ee43b8-090b-4daf-907b-9a21c3986e42" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.173:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:32:45 crc kubenswrapper[4696]: I0318 16:32:45.615866 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb4zx" event={"ID":"7b73d2de-d188-4d58-84da-b09309189aa1","Type":"ContainerStarted","Data":"5d663ce8c36e8d3c49769f4b17bdf43d2b909b40618af78dd66b1929ac827942"} Mar 18 16:32:45 crc kubenswrapper[4696]: I0318 16:32:45.637730 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sb4zx" podStartSLOduration=3.114454413 podStartE2EDuration="27.637710996s" podCreationTimestamp="2026-03-18 16:32:18 +0000 UTC" firstStartedPulling="2026-03-18 16:32:20.371116373 +0000 UTC m=+3383.377290579" lastFinishedPulling="2026-03-18 16:32:44.894372956 +0000 UTC m=+3407.900547162" observedRunningTime="2026-03-18 16:32:45.63349004 +0000 UTC m=+3408.639664256" watchObservedRunningTime="2026-03-18 16:32:45.637710996 +0000 UTC m=+3408.643885202" Mar 18 16:32:49 crc kubenswrapper[4696]: I0318 16:32:49.106612 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:49 crc kubenswrapper[4696]: I0318 16:32:49.107198 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:50 crc kubenswrapper[4696]: I0318 16:32:50.161961 4696 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sb4zx" podUID="7b73d2de-d188-4d58-84da-b09309189aa1" containerName="registry-server" probeResult="failure" output=< Mar 18 16:32:50 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:32:50 crc kubenswrapper[4696]: > Mar 18 16:32:59 crc kubenswrapper[4696]: I0318 16:32:59.154429 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:59 crc kubenswrapper[4696]: I0318 16:32:59.204824 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sb4zx" Mar 18 16:32:59 crc kubenswrapper[4696]: I0318 16:32:59.268337 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sb4zx"] Mar 18 16:32:59 crc kubenswrapper[4696]: I0318 16:32:59.397968 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sn9br"] Mar 18 16:32:59 crc kubenswrapper[4696]: I0318 16:32:59.398239 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sn9br" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="registry-server" containerID="cri-o://c0620558b993c6d7c4bfecdd4c009f34fcf85f5c8eacf94ac2dda90eb884f6d3" gracePeriod=2 Mar 18 16:32:59 crc kubenswrapper[4696]: I0318 16:32:59.748394 4696 generic.go:334] "Generic (PLEG): container finished" podID="318897ee-25bb-4784-ab4a-a09877ea0922" containerID="c0620558b993c6d7c4bfecdd4c009f34fcf85f5c8eacf94ac2dda90eb884f6d3" exitCode=0 Mar 18 16:32:59 crc kubenswrapper[4696]: I0318 16:32:59.749751 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9br" event={"ID":"318897ee-25bb-4784-ab4a-a09877ea0922","Type":"ContainerDied","Data":"c0620558b993c6d7c4bfecdd4c009f34fcf85f5c8eacf94ac2dda90eb884f6d3"} Mar 18 
16:32:59 crc kubenswrapper[4696]: I0318 16:32:59.968121 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sn9br" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.104845 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7tns\" (UniqueName: \"kubernetes.io/projected/318897ee-25bb-4784-ab4a-a09877ea0922-kube-api-access-r7tns\") pod \"318897ee-25bb-4784-ab4a-a09877ea0922\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.104914 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-utilities\") pod \"318897ee-25bb-4784-ab4a-a09877ea0922\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.105051 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-catalog-content\") pod \"318897ee-25bb-4784-ab4a-a09877ea0922\" (UID: \"318897ee-25bb-4784-ab4a-a09877ea0922\") " Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.105879 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-utilities" (OuterVolumeSpecName: "utilities") pod "318897ee-25bb-4784-ab4a-a09877ea0922" (UID: "318897ee-25bb-4784-ab4a-a09877ea0922"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.112729 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318897ee-25bb-4784-ab4a-a09877ea0922-kube-api-access-r7tns" (OuterVolumeSpecName: "kube-api-access-r7tns") pod "318897ee-25bb-4784-ab4a-a09877ea0922" (UID: "318897ee-25bb-4784-ab4a-a09877ea0922"). InnerVolumeSpecName "kube-api-access-r7tns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.207622 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7tns\" (UniqueName: \"kubernetes.io/projected/318897ee-25bb-4784-ab4a-a09877ea0922-kube-api-access-r7tns\") on node \"crc\" DevicePath \"\"" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.207953 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.231519 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "318897ee-25bb-4784-ab4a-a09877ea0922" (UID: "318897ee-25bb-4784-ab4a-a09877ea0922"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.309232 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318897ee-25bb-4784-ab4a-a09877ea0922-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.758632 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sn9br" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.758690 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9br" event={"ID":"318897ee-25bb-4784-ab4a-a09877ea0922","Type":"ContainerDied","Data":"79bf6ef232a4afe65a5ffe664907cbd9ab23b7b5af433c7a5b6d535957445cb8"} Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.758722 4696 scope.go:117] "RemoveContainer" containerID="c0620558b993c6d7c4bfecdd4c009f34fcf85f5c8eacf94ac2dda90eb884f6d3" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.784869 4696 scope.go:117] "RemoveContainer" containerID="99f81704dad8895dd2ab79226f57b9f1f4ead97f074cccb34122a29a5059406c" Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.793376 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sn9br"] Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.804289 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sn9br"] Mar 18 16:33:00 crc kubenswrapper[4696]: I0318 16:33:00.816945 4696 scope.go:117] "RemoveContainer" containerID="3f323b8d3626df1b761d989eaf7866aea12dd5bb6e01318c608bacabb71ae046" Mar 18 16:33:01 crc kubenswrapper[4696]: I0318 16:33:01.608120 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" path="/var/lib/kubelet/pods/318897ee-25bb-4784-ab4a-a09877ea0922/volumes" Mar 18 16:33:44 crc kubenswrapper[4696]: I0318 16:33:44.987826 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fqqpj"] Mar 18 16:33:44 crc kubenswrapper[4696]: E0318 16:33:44.990015 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="extract-content" Mar 18 16:33:44 crc kubenswrapper[4696]: I0318 16:33:44.990118 4696 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="extract-content" Mar 18 16:33:44 crc kubenswrapper[4696]: E0318 16:33:44.990202 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="extract-utilities" Mar 18 16:33:44 crc kubenswrapper[4696]: I0318 16:33:44.990282 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="extract-utilities" Mar 18 16:33:44 crc kubenswrapper[4696]: E0318 16:33:44.990367 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="registry-server" Mar 18 16:33:44 crc kubenswrapper[4696]: I0318 16:33:44.990430 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="registry-server" Mar 18 16:33:44 crc kubenswrapper[4696]: I0318 16:33:44.990750 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="318897ee-25bb-4784-ab4a-a09877ea0922" containerName="registry-server" Mar 18 16:33:44 crc kubenswrapper[4696]: I0318 16:33:44.992594 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:44 crc kubenswrapper[4696]: I0318 16:33:44.997779 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqqpj"] Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.139764 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-utilities\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.139834 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mrm7\" (UniqueName: \"kubernetes.io/projected/08e6e198-9c73-4158-b5f7-4f2aa39d3972-kube-api-access-8mrm7\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.139956 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-catalog-content\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.241115 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-catalog-content\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.241186 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-utilities\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.241229 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mrm7\" (UniqueName: \"kubernetes.io/projected/08e6e198-9c73-4158-b5f7-4f2aa39d3972-kube-api-access-8mrm7\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.241876 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-catalog-content\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.241919 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-utilities\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.260074 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mrm7\" (UniqueName: \"kubernetes.io/projected/08e6e198-9c73-4158-b5f7-4f2aa39d3972-kube-api-access-8mrm7\") pod \"community-operators-fqqpj\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.318676 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:45 crc kubenswrapper[4696]: I0318 16:33:45.886346 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqqpj"] Mar 18 16:33:46 crc kubenswrapper[4696]: I0318 16:33:46.349784 4696 generic.go:334] "Generic (PLEG): container finished" podID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerID="5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce" exitCode=0 Mar 18 16:33:46 crc kubenswrapper[4696]: I0318 16:33:46.349899 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqqpj" event={"ID":"08e6e198-9c73-4158-b5f7-4f2aa39d3972","Type":"ContainerDied","Data":"5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce"} Mar 18 16:33:46 crc kubenswrapper[4696]: I0318 16:33:46.350102 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqqpj" event={"ID":"08e6e198-9c73-4158-b5f7-4f2aa39d3972","Type":"ContainerStarted","Data":"0b2a3bb998ca1aa1b5fcaa7392bdfbb7e09f5c5e9d671c08a1e4b6f869103f1f"} Mar 18 16:33:48 crc kubenswrapper[4696]: I0318 16:33:48.811177 4696 generic.go:334] "Generic (PLEG): container finished" podID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerID="828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e" exitCode=0 Mar 18 16:33:48 crc kubenswrapper[4696]: I0318 16:33:48.811242 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqqpj" event={"ID":"08e6e198-9c73-4158-b5f7-4f2aa39d3972","Type":"ContainerDied","Data":"828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e"} Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.676224 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ldwb6"] Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.681327 4696 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.689090 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ldwb6"] Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.725744 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrlmg\" (UniqueName: \"kubernetes.io/projected/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-kube-api-access-nrlmg\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.725944 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-utilities\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.725999 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-catalog-content\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.828000 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-utilities\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.828309 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-catalog-content\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.828356 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrlmg\" (UniqueName: \"kubernetes.io/projected/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-kube-api-access-nrlmg\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.828534 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-utilities\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.828915 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-catalog-content\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:50 crc kubenswrapper[4696]: I0318 16:33:50.848026 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrlmg\" (UniqueName: \"kubernetes.io/projected/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-kube-api-access-nrlmg\") pod \"certified-operators-ldwb6\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:51 crc kubenswrapper[4696]: I0318 16:33:51.004589 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:33:51 crc kubenswrapper[4696]: I0318 16:33:51.622242 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ldwb6"] Mar 18 16:33:51 crc kubenswrapper[4696]: I0318 16:33:51.845642 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldwb6" event={"ID":"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06","Type":"ContainerStarted","Data":"808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82"} Mar 18 16:33:51 crc kubenswrapper[4696]: I0318 16:33:51.846002 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldwb6" event={"ID":"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06","Type":"ContainerStarted","Data":"dadec6c049c9e14958892c8eff26411e127b5f72ef85e200f224a940e588c602"} Mar 18 16:33:51 crc kubenswrapper[4696]: I0318 16:33:51.851196 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqqpj" event={"ID":"08e6e198-9c73-4158-b5f7-4f2aa39d3972","Type":"ContainerStarted","Data":"606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42"} Mar 18 16:33:51 crc kubenswrapper[4696]: I0318 16:33:51.876063 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fqqpj" podStartSLOduration=3.415906176 podStartE2EDuration="7.876042434s" podCreationTimestamp="2026-03-18 16:33:44 +0000 UTC" firstStartedPulling="2026-03-18 16:33:46.351924151 +0000 UTC m=+3469.358098357" lastFinishedPulling="2026-03-18 16:33:50.812060369 +0000 UTC m=+3473.818234615" observedRunningTime="2026-03-18 16:33:51.869008917 +0000 UTC m=+3474.875183123" watchObservedRunningTime="2026-03-18 16:33:51.876042434 +0000 UTC m=+3474.882216640" Mar 18 16:33:52 crc kubenswrapper[4696]: I0318 16:33:52.863416 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerID="808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82" exitCode=0 Mar 18 16:33:52 crc kubenswrapper[4696]: I0318 16:33:52.863564 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldwb6" event={"ID":"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06","Type":"ContainerDied","Data":"808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82"} Mar 18 16:33:55 crc kubenswrapper[4696]: I0318 16:33:55.320725 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:55 crc kubenswrapper[4696]: I0318 16:33:55.321319 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:55 crc kubenswrapper[4696]: I0318 16:33:55.379153 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:33:55 crc kubenswrapper[4696]: I0318 16:33:55.893369 4696 generic.go:334] "Generic (PLEG): container finished" podID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerID="336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be" exitCode=0 Mar 18 16:33:55 crc kubenswrapper[4696]: I0318 16:33:55.893462 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldwb6" event={"ID":"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06","Type":"ContainerDied","Data":"336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be"} Mar 18 16:33:56 crc kubenswrapper[4696]: I0318 16:33:56.902731 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldwb6" event={"ID":"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06","Type":"ContainerStarted","Data":"4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298"} Mar 18 16:33:56 crc kubenswrapper[4696]: I0318 16:33:56.931000 4696 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ldwb6" podStartSLOduration=3.504401664 podStartE2EDuration="6.93097974s" podCreationTimestamp="2026-03-18 16:33:50 +0000 UTC" firstStartedPulling="2026-03-18 16:33:52.866644086 +0000 UTC m=+3475.872818292" lastFinishedPulling="2026-03-18 16:33:56.293222162 +0000 UTC m=+3479.299396368" observedRunningTime="2026-03-18 16:33:56.922819616 +0000 UTC m=+3479.928993822" watchObservedRunningTime="2026-03-18 16:33:56.93097974 +0000 UTC m=+3479.937153946" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.138642 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564194-662zt"] Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.141213 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-662zt" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.145917 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.146018 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.150172 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-662zt"] Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.150855 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.222876 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2bbj\" (UniqueName: \"kubernetes.io/projected/5fa66a8b-5bc8-4701-8d68-2270067943a6-kube-api-access-b2bbj\") pod \"auto-csr-approver-29564194-662zt\" (UID: 
\"5fa66a8b-5bc8-4701-8d68-2270067943a6\") " pod="openshift-infra/auto-csr-approver-29564194-662zt" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.325721 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2bbj\" (UniqueName: \"kubernetes.io/projected/5fa66a8b-5bc8-4701-8d68-2270067943a6-kube-api-access-b2bbj\") pod \"auto-csr-approver-29564194-662zt\" (UID: \"5fa66a8b-5bc8-4701-8d68-2270067943a6\") " pod="openshift-infra/auto-csr-approver-29564194-662zt" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.345477 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2bbj\" (UniqueName: \"kubernetes.io/projected/5fa66a8b-5bc8-4701-8d68-2270067943a6-kube-api-access-b2bbj\") pod \"auto-csr-approver-29564194-662zt\" (UID: \"5fa66a8b-5bc8-4701-8d68-2270067943a6\") " pod="openshift-infra/auto-csr-approver-29564194-662zt" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.461766 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-662zt" Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.901212 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-662zt"] Mar 18 16:34:00 crc kubenswrapper[4696]: I0318 16:34:00.940508 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-662zt" event={"ID":"5fa66a8b-5bc8-4701-8d68-2270067943a6","Type":"ContainerStarted","Data":"65cf62db699f51d0e5bc695eb620d7fd5c92c296839d59d6058fd8c94828986f"} Mar 18 16:34:01 crc kubenswrapper[4696]: I0318 16:34:01.004761 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:34:01 crc kubenswrapper[4696]: I0318 16:34:01.004830 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:34:01 crc kubenswrapper[4696]: I0318 16:34:01.056158 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:34:02 crc kubenswrapper[4696]: I0318 16:34:02.000051 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:34:02 crc kubenswrapper[4696]: I0318 16:34:02.048235 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ldwb6"] Mar 18 16:34:02 crc kubenswrapper[4696]: I0318 16:34:02.956415 4696 generic.go:334] "Generic (PLEG): container finished" podID="5fa66a8b-5bc8-4701-8d68-2270067943a6" containerID="ac5cd905e7460a32d079207deaef6bf06148fb1871f95dac27f9c7f3598d56d5" exitCode=0 Mar 18 16:34:02 crc kubenswrapper[4696]: I0318 16:34:02.956471 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-662zt" 
event={"ID":"5fa66a8b-5bc8-4701-8d68-2270067943a6","Type":"ContainerDied","Data":"ac5cd905e7460a32d079207deaef6bf06148fb1871f95dac27f9c7f3598d56d5"} Mar 18 16:34:03 crc kubenswrapper[4696]: I0318 16:34:03.964816 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ldwb6" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerName="registry-server" containerID="cri-o://4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298" gracePeriod=2 Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.354610 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-662zt" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.404576 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2bbj\" (UniqueName: \"kubernetes.io/projected/5fa66a8b-5bc8-4701-8d68-2270067943a6-kube-api-access-b2bbj\") pod \"5fa66a8b-5bc8-4701-8d68-2270067943a6\" (UID: \"5fa66a8b-5bc8-4701-8d68-2270067943a6\") " Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.411833 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa66a8b-5bc8-4701-8d68-2270067943a6-kube-api-access-b2bbj" (OuterVolumeSpecName: "kube-api-access-b2bbj") pod "5fa66a8b-5bc8-4701-8d68-2270067943a6" (UID: "5fa66a8b-5bc8-4701-8d68-2270067943a6"). InnerVolumeSpecName "kube-api-access-b2bbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.477981 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.506669 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2bbj\" (UniqueName: \"kubernetes.io/projected/5fa66a8b-5bc8-4701-8d68-2270067943a6-kube-api-access-b2bbj\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.607856 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrlmg\" (UniqueName: \"kubernetes.io/projected/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-kube-api-access-nrlmg\") pod \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.608213 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-catalog-content\") pod \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.608395 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-utilities\") pod \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\" (UID: \"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06\") " Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.609043 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-utilities" (OuterVolumeSpecName: "utilities") pod "f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" (UID: "f077bd55-dda3-4ce7-b5a5-01bbbb65cb06"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.611502 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-kube-api-access-nrlmg" (OuterVolumeSpecName: "kube-api-access-nrlmg") pod "f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" (UID: "f077bd55-dda3-4ce7-b5a5-01bbbb65cb06"). InnerVolumeSpecName "kube-api-access-nrlmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.663786 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" (UID: "f077bd55-dda3-4ce7-b5a5-01bbbb65cb06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.711054 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.711091 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrlmg\" (UniqueName: \"kubernetes.io/projected/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-kube-api-access-nrlmg\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.711104 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.973999 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-662zt" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.973995 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-662zt" event={"ID":"5fa66a8b-5bc8-4701-8d68-2270067943a6","Type":"ContainerDied","Data":"65cf62db699f51d0e5bc695eb620d7fd5c92c296839d59d6058fd8c94828986f"} Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.974556 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65cf62db699f51d0e5bc695eb620d7fd5c92c296839d59d6058fd8c94828986f" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.976325 4696 generic.go:334] "Generic (PLEG): container finished" podID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerID="4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298" exitCode=0 Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.976358 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldwb6" event={"ID":"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06","Type":"ContainerDied","Data":"4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298"} Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.976372 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ldwb6" Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.976383 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldwb6" event={"ID":"f077bd55-dda3-4ce7-b5a5-01bbbb65cb06","Type":"ContainerDied","Data":"dadec6c049c9e14958892c8eff26411e127b5f72ef85e200f224a940e588c602"} Mar 18 16:34:04 crc kubenswrapper[4696]: I0318 16:34:04.976408 4696 scope.go:117] "RemoveContainer" containerID="4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.009716 4696 scope.go:117] "RemoveContainer" containerID="336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.017681 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ldwb6"] Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.028593 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ldwb6"] Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.033625 4696 scope.go:117] "RemoveContainer" containerID="808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.053075 4696 scope.go:117] "RemoveContainer" containerID="4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298" Mar 18 16:34:05 crc kubenswrapper[4696]: E0318 16:34:05.053596 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298\": container with ID starting with 4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298 not found: ID does not exist" containerID="4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.053643 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298"} err="failed to get container status \"4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298\": rpc error: code = NotFound desc = could not find container \"4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298\": container with ID starting with 4dc5890161402f7223862da74fb9781f58542640ba6ea1434fc39ec1ad7f2298 not found: ID does not exist" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.053673 4696 scope.go:117] "RemoveContainer" containerID="336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be" Mar 18 16:34:05 crc kubenswrapper[4696]: E0318 16:34:05.054205 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be\": container with ID starting with 336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be not found: ID does not exist" containerID="336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.054257 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be"} err="failed to get container status \"336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be\": rpc error: code = NotFound desc = could not find container \"336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be\": container with ID starting with 336685fec5e932c4e19e13d9c7e559a6e476583631d280ee61bbf3c8554804be not found: ID does not exist" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.054295 4696 scope.go:117] "RemoveContainer" containerID="808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82" Mar 18 16:34:05 crc kubenswrapper[4696]: E0318 
16:34:05.054811 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82\": container with ID starting with 808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82 not found: ID does not exist" containerID="808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.054837 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82"} err="failed to get container status \"808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82\": rpc error: code = NotFound desc = could not find container \"808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82\": container with ID starting with 808ea17e7f3d85cb99004c909be8710df1aee23cce76e6932227eb8b8f70fe82 not found: ID does not exist" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.363585 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.432650 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-2zs7b"] Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.440231 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-2zs7b"] Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.608172 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674dbf40-fb2b-40a7-9019-ff7c8eac34c2" path="/var/lib/kubelet/pods/674dbf40-fb2b-40a7-9019-ff7c8eac34c2/volumes" Mar 18 16:34:05 crc kubenswrapper[4696]: I0318 16:34:05.608908 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" 
path="/var/lib/kubelet/pods/f077bd55-dda3-4ce7-b5a5-01bbbb65cb06/volumes" Mar 18 16:34:07 crc kubenswrapper[4696]: I0318 16:34:07.958917 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqqpj"] Mar 18 16:34:07 crc kubenswrapper[4696]: I0318 16:34:07.960057 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fqqpj" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerName="registry-server" containerID="cri-o://606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42" gracePeriod=2 Mar 18 16:34:08 crc kubenswrapper[4696]: E0318 16:34:08.148785 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e6e198_9c73_4158_b5f7_4f2aa39d3972.slice/crio-606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e6e198_9c73_4158_b5f7_4f2aa39d3972.slice/crio-conmon-606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.664122 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.771117 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-catalog-content\") pod \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.771785 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mrm7\" (UniqueName: \"kubernetes.io/projected/08e6e198-9c73-4158-b5f7-4f2aa39d3972-kube-api-access-8mrm7\") pod \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.772122 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-utilities\") pod \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\" (UID: \"08e6e198-9c73-4158-b5f7-4f2aa39d3972\") " Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.772944 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-utilities" (OuterVolumeSpecName: "utilities") pod "08e6e198-9c73-4158-b5f7-4f2aa39d3972" (UID: "08e6e198-9c73-4158-b5f7-4f2aa39d3972"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.779358 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e6e198-9c73-4158-b5f7-4f2aa39d3972-kube-api-access-8mrm7" (OuterVolumeSpecName: "kube-api-access-8mrm7") pod "08e6e198-9c73-4158-b5f7-4f2aa39d3972" (UID: "08e6e198-9c73-4158-b5f7-4f2aa39d3972"). InnerVolumeSpecName "kube-api-access-8mrm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.825995 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08e6e198-9c73-4158-b5f7-4f2aa39d3972" (UID: "08e6e198-9c73-4158-b5f7-4f2aa39d3972"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.879157 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.879195 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08e6e198-9c73-4158-b5f7-4f2aa39d3972-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.879207 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mrm7\" (UniqueName: \"kubernetes.io/projected/08e6e198-9c73-4158-b5f7-4f2aa39d3972-kube-api-access-8mrm7\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.897743 4696 generic.go:334] "Generic (PLEG): container finished" podID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerID="606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42" exitCode=0 Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.897782 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqqpj" event={"ID":"08e6e198-9c73-4158-b5f7-4f2aa39d3972","Type":"ContainerDied","Data":"606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42"} Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.897818 4696 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-fqqpj" event={"ID":"08e6e198-9c73-4158-b5f7-4f2aa39d3972","Type":"ContainerDied","Data":"0b2a3bb998ca1aa1b5fcaa7392bdfbb7e09f5c5e9d671c08a1e4b6f869103f1f"} Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.897834 4696 scope.go:117] "RemoveContainer" containerID="606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.897772 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqqpj" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.925105 4696 scope.go:117] "RemoveContainer" containerID="828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.936307 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqqpj"] Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.949625 4696 scope.go:117] "RemoveContainer" containerID="5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.958441 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fqqpj"] Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.994317 4696 scope.go:117] "RemoveContainer" containerID="606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42" Mar 18 16:34:08 crc kubenswrapper[4696]: E0318 16:34:08.994817 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42\": container with ID starting with 606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42 not found: ID does not exist" containerID="606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 
16:34:08.994860 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42"} err="failed to get container status \"606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42\": rpc error: code = NotFound desc = could not find container \"606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42\": container with ID starting with 606c26a344520b25c53a1047dd793c94782244665c62efd5d8990edb329eed42 not found: ID does not exist" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.994918 4696 scope.go:117] "RemoveContainer" containerID="828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e" Mar 18 16:34:08 crc kubenswrapper[4696]: E0318 16:34:08.995445 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e\": container with ID starting with 828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e not found: ID does not exist" containerID="828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.995473 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e"} err="failed to get container status \"828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e\": rpc error: code = NotFound desc = could not find container \"828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e\": container with ID starting with 828e1e2873453d017135e43f231af5825193bccc342187a7dac65c37e6edb36e not found: ID does not exist" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.995494 4696 scope.go:117] "RemoveContainer" containerID="5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce" Mar 18 16:34:08 crc 
kubenswrapper[4696]: E0318 16:34:08.995909 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce\": container with ID starting with 5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce not found: ID does not exist" containerID="5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce" Mar 18 16:34:08 crc kubenswrapper[4696]: I0318 16:34:08.995958 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce"} err="failed to get container status \"5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce\": rpc error: code = NotFound desc = could not find container \"5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce\": container with ID starting with 5df637ce18d1c44dcbb398762c9a90f616e03645abaee0dc03ea16637a9f95ce not found: ID does not exist" Mar 18 16:34:09 crc kubenswrapper[4696]: I0318 16:34:09.615580 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" path="/var/lib/kubelet/pods/08e6e198-9c73-4158-b5f7-4f2aa39d3972/volumes" Mar 18 16:34:12 crc kubenswrapper[4696]: I0318 16:34:12.184784 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:34:12 crc kubenswrapper[4696]: I0318 16:34:12.185328 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 18 16:34:24 crc kubenswrapper[4696]: I0318 16:34:24.528301 4696 scope.go:117] "RemoveContainer" containerID="a42f33bc9910bd5b726d6177ab62670f66b774347a9d6012f6b1e0687c0c2e46" Mar 18 16:34:42 crc kubenswrapper[4696]: I0318 16:34:42.185094 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:34:42 crc kubenswrapper[4696]: I0318 16:34:42.186745 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:35:12 crc kubenswrapper[4696]: I0318 16:35:12.184581 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:35:12 crc kubenswrapper[4696]: I0318 16:35:12.185300 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:35:12 crc kubenswrapper[4696]: I0318 16:35:12.185373 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 16:35:12 crc kubenswrapper[4696]: I0318 16:35:12.186316 4696 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf86dadce3d1fb257b7fe8784bc21654ea04954f310cfd6357d78c465ecfc986"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:35:12 crc kubenswrapper[4696]: I0318 16:35:12.186399 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://cf86dadce3d1fb257b7fe8784bc21654ea04954f310cfd6357d78c465ecfc986" gracePeriod=600 Mar 18 16:35:12 crc kubenswrapper[4696]: I0318 16:35:12.704871 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="cf86dadce3d1fb257b7fe8784bc21654ea04954f310cfd6357d78c465ecfc986" exitCode=0 Mar 18 16:35:12 crc kubenswrapper[4696]: I0318 16:35:12.704949 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"cf86dadce3d1fb257b7fe8784bc21654ea04954f310cfd6357d78c465ecfc986"} Mar 18 16:35:12 crc kubenswrapper[4696]: I0318 16:35:12.705288 4696 scope.go:117] "RemoveContainer" containerID="7949f9450c51f40ee7f760755610ade732cea5a0ef5ce6f33429474b9df89b17" Mar 18 16:35:13 crc kubenswrapper[4696]: I0318 16:35:13.714549 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942"} Mar 18 16:35:29 crc kubenswrapper[4696]: I0318 16:35:29.886749 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="0b6d2f26-746f-404e-817e-ca3b65cc9511" containerID="902bc91e69e9e251b788b2b65ef20041bdb77846792913500c92c9f1a127bac6" exitCode=0 Mar 18 16:35:29 crc kubenswrapper[4696]: I0318 16:35:29.888418 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0b6d2f26-746f-404e-817e-ca3b65cc9511","Type":"ContainerDied","Data":"902bc91e69e9e251b788b2b65ef20041bdb77846792913500c92c9f1a127bac6"} Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.281595 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399103 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-temporary\") pod \"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399170 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config\") pod \"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399203 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ca-certs\") pod \"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399275 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-config-data\") pod 
\"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399323 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ssh-key\") pod \"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399352 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4jwc\" (UniqueName: \"kubernetes.io/projected/0b6d2f26-746f-404e-817e-ca3b65cc9511-kube-api-access-h4jwc\") pod \"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399476 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config-secret\") pod \"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399546 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-workdir\") pod \"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.399570 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0b6d2f26-746f-404e-817e-ca3b65cc9511\" (UID: \"0b6d2f26-746f-404e-817e-ca3b65cc9511\") " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.401550 4696 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-config-data" (OuterVolumeSpecName: "config-data") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.403894 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.404219 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.406463 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6d2f26-746f-404e-817e-ca3b65cc9511-kube-api-access-h4jwc" (OuterVolumeSpecName: "kube-api-access-h4jwc") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "kube-api-access-h4jwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.406542 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.434270 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.439276 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.458326 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.458539 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0b6d2f26-746f-404e-817e-ca3b65cc9511" (UID: "0b6d2f26-746f-404e-817e-ca3b65cc9511"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.501951 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.501991 4696 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.502024 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.502038 4696 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0b6d2f26-746f-404e-817e-ca3b65cc9511-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.502052 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.502065 
4696 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.502076 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b6d2f26-746f-404e-817e-ca3b65cc9511-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.502086 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b6d2f26-746f-404e-817e-ca3b65cc9511-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.502096 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4jwc\" (UniqueName: \"kubernetes.io/projected/0b6d2f26-746f-404e-817e-ca3b65cc9511-kube-api-access-h4jwc\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.525899 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.604012 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.905948 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0b6d2f26-746f-404e-817e-ca3b65cc9511","Type":"ContainerDied","Data":"f6913b29e714cfe5157a85027f2b20af34cd120321718f63a3b3e3d303995ddd"} Mar 18 16:35:31 crc kubenswrapper[4696]: I0318 16:35:31.906027 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6913b29e714cfe5157a85027f2b20af34cd120321718f63a3b3e3d303995ddd" Mar 18 16:35:31 
crc kubenswrapper[4696]: I0318 16:35:31.905993 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.192562 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 16:35:39 crc kubenswrapper[4696]: E0318 16:35:39.193471 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerName="extract-content" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193485 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerName="extract-content" Mar 18 16:35:39 crc kubenswrapper[4696]: E0318 16:35:39.193498 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerName="registry-server" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193505 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerName="registry-server" Mar 18 16:35:39 crc kubenswrapper[4696]: E0318 16:35:39.193538 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerName="registry-server" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193546 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerName="registry-server" Mar 18 16:35:39 crc kubenswrapper[4696]: E0318 16:35:39.193560 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerName="extract-utilities" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193568 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerName="extract-utilities" Mar 18 16:35:39 crc kubenswrapper[4696]: E0318 
16:35:39.193588 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerName="extract-content" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193594 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerName="extract-content" Mar 18 16:35:39 crc kubenswrapper[4696]: E0318 16:35:39.193607 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa66a8b-5bc8-4701-8d68-2270067943a6" containerName="oc" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193620 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa66a8b-5bc8-4701-8d68-2270067943a6" containerName="oc" Mar 18 16:35:39 crc kubenswrapper[4696]: E0318 16:35:39.193638 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6d2f26-746f-404e-817e-ca3b65cc9511" containerName="tempest-tests-tempest-tests-runner" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193646 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6d2f26-746f-404e-817e-ca3b65cc9511" containerName="tempest-tests-tempest-tests-runner" Mar 18 16:35:39 crc kubenswrapper[4696]: E0318 16:35:39.193660 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerName="extract-utilities" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193667 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerName="extract-utilities" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193900 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa66a8b-5bc8-4701-8d68-2270067943a6" containerName="oc" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193931 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6d2f26-746f-404e-817e-ca3b65cc9511" containerName="tempest-tests-tempest-tests-runner" Mar 18 16:35:39 crc 
kubenswrapper[4696]: I0318 16:35:39.193939 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e6e198-9c73-4158-b5f7-4f2aa39d3972" containerName="registry-server" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.193950 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f077bd55-dda3-4ce7-b5a5-01bbbb65cb06" containerName="registry-server" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.194549 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.196427 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mh9cr" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.201333 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.373476 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3166f000-157f-4751-a28c-a2e5f5caa4d9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.373622 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnp4k\" (UniqueName: \"kubernetes.io/projected/3166f000-157f-4751-a28c-a2e5f5caa4d9-kube-api-access-wnp4k\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3166f000-157f-4751-a28c-a2e5f5caa4d9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.475747 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wnp4k\" (UniqueName: \"kubernetes.io/projected/3166f000-157f-4751-a28c-a2e5f5caa4d9-kube-api-access-wnp4k\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3166f000-157f-4751-a28c-a2e5f5caa4d9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.475945 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3166f000-157f-4751-a28c-a2e5f5caa4d9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.476433 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3166f000-157f-4751-a28c-a2e5f5caa4d9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.501976 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3166f000-157f-4751-a28c-a2e5f5caa4d9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.504818 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnp4k\" (UniqueName: \"kubernetes.io/projected/3166f000-157f-4751-a28c-a2e5f5caa4d9-kube-api-access-wnp4k\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3166f000-157f-4751-a28c-a2e5f5caa4d9\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.521899 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.977417 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 16:35:39 crc kubenswrapper[4696]: I0318 16:35:39.983970 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:35:40 crc kubenswrapper[4696]: I0318 16:35:40.988471 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3166f000-157f-4751-a28c-a2e5f5caa4d9","Type":"ContainerStarted","Data":"6dca9ef8d9aede69e0fed8a8edcd4559a2accfd5b4b8cb4d23a106922043357a"} Mar 18 16:35:42 crc kubenswrapper[4696]: I0318 16:35:42.000130 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3166f000-157f-4751-a28c-a2e5f5caa4d9","Type":"ContainerStarted","Data":"6c3e8863739c926c4a4edc906a9b329893ce00d9e814d8a7eb2f14d7017265be"} Mar 18 16:35:42 crc kubenswrapper[4696]: I0318 16:35:42.025617 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.010594131 podStartE2EDuration="3.025591145s" podCreationTimestamp="2026-03-18 16:35:39 +0000 UTC" firstStartedPulling="2026-03-18 16:35:39.983512214 +0000 UTC m=+3582.989686440" lastFinishedPulling="2026-03-18 16:35:40.998509248 +0000 UTC m=+3584.004683454" observedRunningTime="2026-03-18 16:35:42.012977469 +0000 UTC m=+3585.019151675" watchObservedRunningTime="2026-03-18 16:35:42.025591145 +0000 UTC m=+3585.031765361" Mar 18 16:36:00 crc kubenswrapper[4696]: 
I0318 16:36:00.141195 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7ph7t"] Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.143382 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-7ph7t" Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.145752 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.145884 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.146758 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.162920 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7ph7t"] Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.166821 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxrg\" (UniqueName: \"kubernetes.io/projected/a00ab367-0a0b-44c1-bd91-d54036751bc3-kube-api-access-rdxrg\") pod \"auto-csr-approver-29564196-7ph7t\" (UID: \"a00ab367-0a0b-44c1-bd91-d54036751bc3\") " pod="openshift-infra/auto-csr-approver-29564196-7ph7t" Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.268825 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxrg\" (UniqueName: \"kubernetes.io/projected/a00ab367-0a0b-44c1-bd91-d54036751bc3-kube-api-access-rdxrg\") pod \"auto-csr-approver-29564196-7ph7t\" (UID: \"a00ab367-0a0b-44c1-bd91-d54036751bc3\") " pod="openshift-infra/auto-csr-approver-29564196-7ph7t" Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.299456 4696 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rdxrg\" (UniqueName: \"kubernetes.io/projected/a00ab367-0a0b-44c1-bd91-d54036751bc3-kube-api-access-rdxrg\") pod \"auto-csr-approver-29564196-7ph7t\" (UID: \"a00ab367-0a0b-44c1-bd91-d54036751bc3\") " pod="openshift-infra/auto-csr-approver-29564196-7ph7t" Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.469026 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-7ph7t" Mar 18 16:36:00 crc kubenswrapper[4696]: I0318 16:36:00.936954 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7ph7t"] Mar 18 16:36:00 crc kubenswrapper[4696]: W0318 16:36:00.943175 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda00ab367_0a0b_44c1_bd91_d54036751bc3.slice/crio-c4fd112dd89f690272e4b9beb9fbbdb31e760ac92d6a093835092489496d7ab1 WatchSource:0}: Error finding container c4fd112dd89f690272e4b9beb9fbbdb31e760ac92d6a093835092489496d7ab1: Status 404 returned error can't find the container with id c4fd112dd89f690272e4b9beb9fbbdb31e760ac92d6a093835092489496d7ab1 Mar 18 16:36:01 crc kubenswrapper[4696]: I0318 16:36:01.175273 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-7ph7t" event={"ID":"a00ab367-0a0b-44c1-bd91-d54036751bc3","Type":"ContainerStarted","Data":"c4fd112dd89f690272e4b9beb9fbbdb31e760ac92d6a093835092489496d7ab1"} Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.199865 4696 generic.go:334] "Generic (PLEG): container finished" podID="a00ab367-0a0b-44c1-bd91-d54036751bc3" containerID="46c125917c31de0740cf1a058d88adcbf327d40dc6a7e118d09a377126f31485" exitCode=0 Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.200623 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-7ph7t" 
event={"ID":"a00ab367-0a0b-44c1-bd91-d54036751bc3","Type":"ContainerDied","Data":"46c125917c31de0740cf1a058d88adcbf327d40dc6a7e118d09a377126f31485"} Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.744110 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ng8w4/must-gather-9rxkb"] Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.745927 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.749219 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ng8w4"/"openshift-service-ca.crt" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.761315 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ng8w4/must-gather-9rxkb"] Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.761781 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ng8w4"/"default-dockercfg-v9jlc" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.762057 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ng8w4"/"kube-root-ca.crt" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.847174 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7293f441-53f0-4872-9d79-c1198766aa86-must-gather-output\") pod \"must-gather-9rxkb\" (UID: \"7293f441-53f0-4872-9d79-c1198766aa86\") " pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.847212 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljrhx\" (UniqueName: \"kubernetes.io/projected/7293f441-53f0-4872-9d79-c1198766aa86-kube-api-access-ljrhx\") pod \"must-gather-9rxkb\" (UID: 
\"7293f441-53f0-4872-9d79-c1198766aa86\") " pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.949133 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7293f441-53f0-4872-9d79-c1198766aa86-must-gather-output\") pod \"must-gather-9rxkb\" (UID: \"7293f441-53f0-4872-9d79-c1198766aa86\") " pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.949186 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljrhx\" (UniqueName: \"kubernetes.io/projected/7293f441-53f0-4872-9d79-c1198766aa86-kube-api-access-ljrhx\") pod \"must-gather-9rxkb\" (UID: \"7293f441-53f0-4872-9d79-c1198766aa86\") " pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.949827 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7293f441-53f0-4872-9d79-c1198766aa86-must-gather-output\") pod \"must-gather-9rxkb\" (UID: \"7293f441-53f0-4872-9d79-c1198766aa86\") " pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:36:03 crc kubenswrapper[4696]: I0318 16:36:03.968017 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljrhx\" (UniqueName: \"kubernetes.io/projected/7293f441-53f0-4872-9d79-c1198766aa86-kube-api-access-ljrhx\") pod \"must-gather-9rxkb\" (UID: \"7293f441-53f0-4872-9d79-c1198766aa86\") " pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:36:04 crc kubenswrapper[4696]: I0318 16:36:04.063804 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:36:04 crc kubenswrapper[4696]: I0318 16:36:04.616154 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-7ph7t" Mar 18 16:36:04 crc kubenswrapper[4696]: I0318 16:36:04.718004 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ng8w4/must-gather-9rxkb"] Mar 18 16:36:04 crc kubenswrapper[4696]: I0318 16:36:04.765421 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdxrg\" (UniqueName: \"kubernetes.io/projected/a00ab367-0a0b-44c1-bd91-d54036751bc3-kube-api-access-rdxrg\") pod \"a00ab367-0a0b-44c1-bd91-d54036751bc3\" (UID: \"a00ab367-0a0b-44c1-bd91-d54036751bc3\") " Mar 18 16:36:04 crc kubenswrapper[4696]: I0318 16:36:04.772613 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00ab367-0a0b-44c1-bd91-d54036751bc3-kube-api-access-rdxrg" (OuterVolumeSpecName: "kube-api-access-rdxrg") pod "a00ab367-0a0b-44c1-bd91-d54036751bc3" (UID: "a00ab367-0a0b-44c1-bd91-d54036751bc3"). InnerVolumeSpecName "kube-api-access-rdxrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:36:04 crc kubenswrapper[4696]: I0318 16:36:04.868947 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdxrg\" (UniqueName: \"kubernetes.io/projected/a00ab367-0a0b-44c1-bd91-d54036751bc3-kube-api-access-rdxrg\") on node \"crc\" DevicePath \"\"" Mar 18 16:36:05 crc kubenswrapper[4696]: I0318 16:36:05.220164 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" event={"ID":"7293f441-53f0-4872-9d79-c1198766aa86","Type":"ContainerStarted","Data":"1b2d1b752dee90dac7882204c23406ddcf1b18cea7686649aadd8a01e43f5990"} Mar 18 16:36:05 crc kubenswrapper[4696]: I0318 16:36:05.221775 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-7ph7t" event={"ID":"a00ab367-0a0b-44c1-bd91-d54036751bc3","Type":"ContainerDied","Data":"c4fd112dd89f690272e4b9beb9fbbdb31e760ac92d6a093835092489496d7ab1"} Mar 18 16:36:05 crc kubenswrapper[4696]: I0318 16:36:05.221797 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4fd112dd89f690272e4b9beb9fbbdb31e760ac92d6a093835092489496d7ab1" Mar 18 16:36:05 crc kubenswrapper[4696]: I0318 16:36:05.221842 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-7ph7t" Mar 18 16:36:05 crc kubenswrapper[4696]: I0318 16:36:05.687368 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-jfx4w"] Mar 18 16:36:05 crc kubenswrapper[4696]: I0318 16:36:05.699584 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-jfx4w"] Mar 18 16:36:07 crc kubenswrapper[4696]: I0318 16:36:07.611024 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e065bd04-e1af-434e-bc60-2c45b41de3d4" path="/var/lib/kubelet/pods/e065bd04-e1af-434e-bc60-2c45b41de3d4/volumes" Mar 18 16:36:11 crc kubenswrapper[4696]: I0318 16:36:11.278607 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" event={"ID":"7293f441-53f0-4872-9d79-c1198766aa86","Type":"ContainerStarted","Data":"d86441c8d99f7bb010e60a89e59299ba9d4fcb66f33de1d5276f2f3194814404"} Mar 18 16:36:11 crc kubenswrapper[4696]: I0318 16:36:11.279209 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" event={"ID":"7293f441-53f0-4872-9d79-c1198766aa86","Type":"ContainerStarted","Data":"b2d87c0a02b40a5666781ec6a05a06e5158b12645914a14bf1e86cc5580a5f7f"} Mar 18 16:36:11 crc kubenswrapper[4696]: I0318 16:36:11.310118 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" podStartSLOduration=2.6547459890000003 podStartE2EDuration="8.310099426s" podCreationTimestamp="2026-03-18 16:36:03 +0000 UTC" firstStartedPulling="2026-03-18 16:36:04.732948565 +0000 UTC m=+3607.739122771" lastFinishedPulling="2026-03-18 16:36:10.388302012 +0000 UTC m=+3613.394476208" observedRunningTime="2026-03-18 16:36:11.301276684 +0000 UTC m=+3614.307450890" watchObservedRunningTime="2026-03-18 16:36:11.310099426 +0000 UTC m=+3614.316273632" Mar 18 16:36:13 crc 
kubenswrapper[4696]: E0318 16:36:13.775767 4696 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.192:57926->38.102.83.192:37641: write tcp 38.102.83.192:57926->38.102.83.192:37641: write: broken pipe Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.475097 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-k9plt"] Mar 18 16:36:14 crc kubenswrapper[4696]: E0318 16:36:14.475563 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00ab367-0a0b-44c1-bd91-d54036751bc3" containerName="oc" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.475585 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00ab367-0a0b-44c1-bd91-d54036751bc3" containerName="oc" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.475781 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00ab367-0a0b-44c1-bd91-d54036751bc3" containerName="oc" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.476436 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.555588 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snzfc\" (UniqueName: \"kubernetes.io/projected/a33478c7-526f-406d-9989-20df85ba5b54-kube-api-access-snzfc\") pod \"crc-debug-k9plt\" (UID: \"a33478c7-526f-406d-9989-20df85ba5b54\") " pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.555676 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a33478c7-526f-406d-9989-20df85ba5b54-host\") pod \"crc-debug-k9plt\" (UID: \"a33478c7-526f-406d-9989-20df85ba5b54\") " pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.657706 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snzfc\" (UniqueName: \"kubernetes.io/projected/a33478c7-526f-406d-9989-20df85ba5b54-kube-api-access-snzfc\") pod \"crc-debug-k9plt\" (UID: \"a33478c7-526f-406d-9989-20df85ba5b54\") " pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.657791 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a33478c7-526f-406d-9989-20df85ba5b54-host\") pod \"crc-debug-k9plt\" (UID: \"a33478c7-526f-406d-9989-20df85ba5b54\") " pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.657945 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a33478c7-526f-406d-9989-20df85ba5b54-host\") pod \"crc-debug-k9plt\" (UID: \"a33478c7-526f-406d-9989-20df85ba5b54\") " pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:36:14 crc 
kubenswrapper[4696]: I0318 16:36:14.676623 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snzfc\" (UniqueName: \"kubernetes.io/projected/a33478c7-526f-406d-9989-20df85ba5b54-kube-api-access-snzfc\") pod \"crc-debug-k9plt\" (UID: \"a33478c7-526f-406d-9989-20df85ba5b54\") " pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:36:14 crc kubenswrapper[4696]: I0318 16:36:14.794484 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:36:15 crc kubenswrapper[4696]: I0318 16:36:15.317578 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/crc-debug-k9plt" event={"ID":"a33478c7-526f-406d-9989-20df85ba5b54","Type":"ContainerStarted","Data":"9a5a1e9812aae6c59fd0fc92058cb602592bfc4a2c5f3a9ac1e4d09048f886c6"} Mar 18 16:36:24 crc kubenswrapper[4696]: I0318 16:36:24.648054 4696 scope.go:117] "RemoveContainer" containerID="abddb6636fdf0314ab2ed4753401581e6c0d02f041c33fd4f21bb9a61c7ae616" Mar 18 16:36:26 crc kubenswrapper[4696]: I0318 16:36:26.446418 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/crc-debug-k9plt" event={"ID":"a33478c7-526f-406d-9989-20df85ba5b54","Type":"ContainerStarted","Data":"e95b3d4dfa6b5c20a18833ec0fef1e4bdcc1a048e8456f6bd60e06b075c2e16e"} Mar 18 16:36:26 crc kubenswrapper[4696]: I0318 16:36:26.467367 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ng8w4/crc-debug-k9plt" podStartSLOduration=1.7773535329999999 podStartE2EDuration="12.467350809s" podCreationTimestamp="2026-03-18 16:36:14 +0000 UTC" firstStartedPulling="2026-03-18 16:36:14.917669459 +0000 UTC m=+3617.923843665" lastFinishedPulling="2026-03-18 16:36:25.607666725 +0000 UTC m=+3628.613840941" observedRunningTime="2026-03-18 16:36:26.461741008 +0000 UTC m=+3629.467915214" watchObservedRunningTime="2026-03-18 16:36:26.467350809 +0000 UTC 
m=+3629.473525015" Mar 18 16:37:07 crc kubenswrapper[4696]: I0318 16:37:07.845201 4696 generic.go:334] "Generic (PLEG): container finished" podID="a33478c7-526f-406d-9989-20df85ba5b54" containerID="e95b3d4dfa6b5c20a18833ec0fef1e4bdcc1a048e8456f6bd60e06b075c2e16e" exitCode=0 Mar 18 16:37:07 crc kubenswrapper[4696]: I0318 16:37:07.845735 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/crc-debug-k9plt" event={"ID":"a33478c7-526f-406d-9989-20df85ba5b54","Type":"ContainerDied","Data":"e95b3d4dfa6b5c20a18833ec0fef1e4bdcc1a048e8456f6bd60e06b075c2e16e"} Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.002137 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.038182 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-k9plt"] Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.049528 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-k9plt"] Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.154874 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snzfc\" (UniqueName: \"kubernetes.io/projected/a33478c7-526f-406d-9989-20df85ba5b54-kube-api-access-snzfc\") pod \"a33478c7-526f-406d-9989-20df85ba5b54\" (UID: \"a33478c7-526f-406d-9989-20df85ba5b54\") " Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.155378 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a33478c7-526f-406d-9989-20df85ba5b54-host\") pod \"a33478c7-526f-406d-9989-20df85ba5b54\" (UID: \"a33478c7-526f-406d-9989-20df85ba5b54\") " Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.155488 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a33478c7-526f-406d-9989-20df85ba5b54-host" (OuterVolumeSpecName: "host") pod "a33478c7-526f-406d-9989-20df85ba5b54" (UID: "a33478c7-526f-406d-9989-20df85ba5b54"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.155818 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a33478c7-526f-406d-9989-20df85ba5b54-host\") on node \"crc\" DevicePath \"\"" Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.160500 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33478c7-526f-406d-9989-20df85ba5b54-kube-api-access-snzfc" (OuterVolumeSpecName: "kube-api-access-snzfc") pod "a33478c7-526f-406d-9989-20df85ba5b54" (UID: "a33478c7-526f-406d-9989-20df85ba5b54"). InnerVolumeSpecName "kube-api-access-snzfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.257595 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snzfc\" (UniqueName: \"kubernetes.io/projected/a33478c7-526f-406d-9989-20df85ba5b54-kube-api-access-snzfc\") on node \"crc\" DevicePath \"\"" Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.607572 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33478c7-526f-406d-9989-20df85ba5b54" path="/var/lib/kubelet/pods/a33478c7-526f-406d-9989-20df85ba5b54/volumes" Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.873075 4696 scope.go:117] "RemoveContainer" containerID="e95b3d4dfa6b5c20a18833ec0fef1e4bdcc1a048e8456f6bd60e06b075c2e16e" Mar 18 16:37:09 crc kubenswrapper[4696]: I0318 16:37:09.873105 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-k9plt" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.196659 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-rc5n2"] Mar 18 16:37:10 crc kubenswrapper[4696]: E0318 16:37:10.197847 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33478c7-526f-406d-9989-20df85ba5b54" containerName="container-00" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.197926 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33478c7-526f-406d-9989-20df85ba5b54" containerName="container-00" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.198186 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33478c7-526f-406d-9989-20df85ba5b54" containerName="container-00" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.199094 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.376913 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzhs\" (UniqueName: \"kubernetes.io/projected/44c79eec-ac44-45a6-bb90-161d63c388f5-kube-api-access-ktzhs\") pod \"crc-debug-rc5n2\" (UID: \"44c79eec-ac44-45a6-bb90-161d63c388f5\") " pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.377310 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44c79eec-ac44-45a6-bb90-161d63c388f5-host\") pod \"crc-debug-rc5n2\" (UID: \"44c79eec-ac44-45a6-bb90-161d63c388f5\") " pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.478731 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzhs\" (UniqueName: 
\"kubernetes.io/projected/44c79eec-ac44-45a6-bb90-161d63c388f5-kube-api-access-ktzhs\") pod \"crc-debug-rc5n2\" (UID: \"44c79eec-ac44-45a6-bb90-161d63c388f5\") " pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.478853 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44c79eec-ac44-45a6-bb90-161d63c388f5-host\") pod \"crc-debug-rc5n2\" (UID: \"44c79eec-ac44-45a6-bb90-161d63c388f5\") " pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.479127 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44c79eec-ac44-45a6-bb90-161d63c388f5-host\") pod \"crc-debug-rc5n2\" (UID: \"44c79eec-ac44-45a6-bb90-161d63c388f5\") " pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.500059 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzhs\" (UniqueName: \"kubernetes.io/projected/44c79eec-ac44-45a6-bb90-161d63c388f5-kube-api-access-ktzhs\") pod \"crc-debug-rc5n2\" (UID: \"44c79eec-ac44-45a6-bb90-161d63c388f5\") " pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.516080 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.883978 4696 generic.go:334] "Generic (PLEG): container finished" podID="44c79eec-ac44-45a6-bb90-161d63c388f5" containerID="49f6a024b46e2b01ca3968182379c49730b111c05d041f990f3c2f5536ad5c28" exitCode=0 Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.884040 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" event={"ID":"44c79eec-ac44-45a6-bb90-161d63c388f5","Type":"ContainerDied","Data":"49f6a024b46e2b01ca3968182379c49730b111c05d041f990f3c2f5536ad5c28"} Mar 18 16:37:10 crc kubenswrapper[4696]: I0318 16:37:10.884086 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" event={"ID":"44c79eec-ac44-45a6-bb90-161d63c388f5","Type":"ContainerStarted","Data":"b6e78c830d39e244baf2e8eca1d88e637e04bb2515dd6c75b168f916c8d48632"} Mar 18 16:37:11 crc kubenswrapper[4696]: I0318 16:37:11.303149 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-rc5n2"] Mar 18 16:37:11 crc kubenswrapper[4696]: I0318 16:37:11.310838 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-rc5n2"] Mar 18 16:37:11 crc kubenswrapper[4696]: I0318 16:37:11.991159 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.113160 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44c79eec-ac44-45a6-bb90-161d63c388f5-host\") pod \"44c79eec-ac44-45a6-bb90-161d63c388f5\" (UID: \"44c79eec-ac44-45a6-bb90-161d63c388f5\") " Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.113223 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktzhs\" (UniqueName: \"kubernetes.io/projected/44c79eec-ac44-45a6-bb90-161d63c388f5-kube-api-access-ktzhs\") pod \"44c79eec-ac44-45a6-bb90-161d63c388f5\" (UID: \"44c79eec-ac44-45a6-bb90-161d63c388f5\") " Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.113288 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44c79eec-ac44-45a6-bb90-161d63c388f5-host" (OuterVolumeSpecName: "host") pod "44c79eec-ac44-45a6-bb90-161d63c388f5" (UID: "44c79eec-ac44-45a6-bb90-161d63c388f5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.114090 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44c79eec-ac44-45a6-bb90-161d63c388f5-host\") on node \"crc\" DevicePath \"\"" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.118197 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c79eec-ac44-45a6-bb90-161d63c388f5-kube-api-access-ktzhs" (OuterVolumeSpecName: "kube-api-access-ktzhs") pod "44c79eec-ac44-45a6-bb90-161d63c388f5" (UID: "44c79eec-ac44-45a6-bb90-161d63c388f5"). InnerVolumeSpecName "kube-api-access-ktzhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.184257 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.184341 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.215754 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktzhs\" (UniqueName: \"kubernetes.io/projected/44c79eec-ac44-45a6-bb90-161d63c388f5-kube-api-access-ktzhs\") on node \"crc\" DevicePath \"\"" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.481339 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-ps6mn"] Mar 18 16:37:12 crc kubenswrapper[4696]: E0318 16:37:12.481866 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c79eec-ac44-45a6-bb90-161d63c388f5" containerName="container-00" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.481882 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c79eec-ac44-45a6-bb90-161d63c388f5" containerName="container-00" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.482138 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c79eec-ac44-45a6-bb90-161d63c388f5" containerName="container-00" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.482817 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.519269 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-255h5\" (UniqueName: \"kubernetes.io/projected/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-kube-api-access-255h5\") pod \"crc-debug-ps6mn\" (UID: \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\") " pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.519345 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-host\") pod \"crc-debug-ps6mn\" (UID: \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\") " pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.622768 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-255h5\" (UniqueName: \"kubernetes.io/projected/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-kube-api-access-255h5\") pod \"crc-debug-ps6mn\" (UID: \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\") " pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.622829 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-host\") pod \"crc-debug-ps6mn\" (UID: \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\") " pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.622967 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-host\") pod \"crc-debug-ps6mn\" (UID: \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\") " pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:12 crc 
kubenswrapper[4696]: I0318 16:37:12.651500 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-255h5\" (UniqueName: \"kubernetes.io/projected/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-kube-api-access-255h5\") pod \"crc-debug-ps6mn\" (UID: \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\") " pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.816939 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.909390 4696 scope.go:117] "RemoveContainer" containerID="49f6a024b46e2b01ca3968182379c49730b111c05d041f990f3c2f5536ad5c28" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.909571 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-rc5n2" Mar 18 16:37:12 crc kubenswrapper[4696]: I0318 16:37:12.913920 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" event={"ID":"cc0a8fc0-1222-450b-a807-b3d22ec76ad4","Type":"ContainerStarted","Data":"70a3cad46f8b55c1b8c8103eb4df0ff6d4f569ad2bbda56d08ce1adb5ad070d3"} Mar 18 16:37:13 crc kubenswrapper[4696]: I0318 16:37:13.608500 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c79eec-ac44-45a6-bb90-161d63c388f5" path="/var/lib/kubelet/pods/44c79eec-ac44-45a6-bb90-161d63c388f5/volumes" Mar 18 16:37:13 crc kubenswrapper[4696]: I0318 16:37:13.923379 4696 generic.go:334] "Generic (PLEG): container finished" podID="cc0a8fc0-1222-450b-a807-b3d22ec76ad4" containerID="b39d08a83d2e98389dc42ba089d31bd42345481fb76176e0c1ea55655bbd5a77" exitCode=0 Mar 18 16:37:13 crc kubenswrapper[4696]: I0318 16:37:13.923452 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" 
event={"ID":"cc0a8fc0-1222-450b-a807-b3d22ec76ad4","Type":"ContainerDied","Data":"b39d08a83d2e98389dc42ba089d31bd42345481fb76176e0c1ea55655bbd5a77"} Mar 18 16:37:13 crc kubenswrapper[4696]: I0318 16:37:13.959099 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-ps6mn"] Mar 18 16:37:13 crc kubenswrapper[4696]: I0318 16:37:13.966736 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ng8w4/crc-debug-ps6mn"] Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.040591 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.171741 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-host\") pod \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\" (UID: \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\") " Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.171840 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-host" (OuterVolumeSpecName: "host") pod "cc0a8fc0-1222-450b-a807-b3d22ec76ad4" (UID: "cc0a8fc0-1222-450b-a807-b3d22ec76ad4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.171891 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-255h5\" (UniqueName: \"kubernetes.io/projected/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-kube-api-access-255h5\") pod \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\" (UID: \"cc0a8fc0-1222-450b-a807-b3d22ec76ad4\") " Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.172556 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-host\") on node \"crc\" DevicePath \"\"" Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.181706 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-kube-api-access-255h5" (OuterVolumeSpecName: "kube-api-access-255h5") pod "cc0a8fc0-1222-450b-a807-b3d22ec76ad4" (UID: "cc0a8fc0-1222-450b-a807-b3d22ec76ad4"). InnerVolumeSpecName "kube-api-access-255h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.274428 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-255h5\" (UniqueName: \"kubernetes.io/projected/cc0a8fc0-1222-450b-a807-b3d22ec76ad4-kube-api-access-255h5\") on node \"crc\" DevicePath \"\"" Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.610185 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0a8fc0-1222-450b-a807-b3d22ec76ad4" path="/var/lib/kubelet/pods/cc0a8fc0-1222-450b-a807-b3d22ec76ad4/volumes" Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.943097 4696 scope.go:117] "RemoveContainer" containerID="b39d08a83d2e98389dc42ba089d31bd42345481fb76176e0c1ea55655bbd5a77" Mar 18 16:37:15 crc kubenswrapper[4696]: I0318 16:37:15.943134 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ng8w4/crc-debug-ps6mn" Mar 18 16:37:29 crc kubenswrapper[4696]: I0318 16:37:29.827910 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-756f5c8c54-wjfvb_0d221056-d9f9-47b1-9871-65a83cd55cb4/barbican-api/0.log" Mar 18 16:37:29 crc kubenswrapper[4696]: I0318 16:37:29.925384 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-756f5c8c54-wjfvb_0d221056-d9f9-47b1-9871-65a83cd55cb4/barbican-api-log/0.log" Mar 18 16:37:29 crc kubenswrapper[4696]: I0318 16:37:29.996930 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-745b9b4c58-ztgcm_07475c5d-ee2a-407e-986d-245ada3da65c/barbican-keystone-listener/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.061500 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-745b9b4c58-ztgcm_07475c5d-ee2a-407e-986d-245ada3da65c/barbican-keystone-listener-log/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.213485 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d98fdb45f-pcbp7_48631acd-5b2b-48d2-9386-6e023de39655/barbican-worker/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.243492 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d98fdb45f-pcbp7_48631acd-5b2b-48d2-9386-6e023de39655/barbican-worker-log/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.480125 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh_1f8feb1b-5d39-4cb7-996f-dc5e34065193/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.483712 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3/ceilometer-central-agent/0.log" 
Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.598014 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3/ceilometer-notification-agent/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.666367 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3/proxy-httpd/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.693693 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3/sg-core/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.843046 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce02fef3-40fa-46fe-a496-0aada019e24b/cinder-api-log/0.log" Mar 18 16:37:30 crc kubenswrapper[4696]: I0318 16:37:30.874860 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce02fef3-40fa-46fe-a496-0aada019e24b/cinder-api/0.log" Mar 18 16:37:31 crc kubenswrapper[4696]: I0318 16:37:31.043189 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e2ee43b8-090b-4daf-907b-9a21c3986e42/cinder-scheduler/0.log" Mar 18 16:37:31 crc kubenswrapper[4696]: I0318 16:37:31.117419 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e2ee43b8-090b-4daf-907b-9a21c3986e42/probe/0.log" Mar 18 16:37:31 crc kubenswrapper[4696]: I0318 16:37:31.268973 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb_f2c235ca-a193-47df-8495-600e7c8eea37/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:31 crc kubenswrapper[4696]: I0318 16:37:31.395747 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7_3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:31 crc kubenswrapper[4696]: I0318 16:37:31.648185 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-kff5g_b04fc1e7-0f41-46df-90ac-71d0b7d4e29d/init/0.log" Mar 18 16:37:31 crc kubenswrapper[4696]: I0318 16:37:31.841459 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-kff5g_b04fc1e7-0f41-46df-90ac-71d0b7d4e29d/init/0.log" Mar 18 16:37:31 crc kubenswrapper[4696]: I0318 16:37:31.923538 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-kff5g_b04fc1e7-0f41-46df-90ac-71d0b7d4e29d/dnsmasq-dns/0.log" Mar 18 16:37:31 crc kubenswrapper[4696]: I0318 16:37:31.954384 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4_c9ae5aa3-8f8f-4951-85ec-1b3583c90481/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:32 crc kubenswrapper[4696]: I0318 16:37:32.165980 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9351c230-91b7-40c0-afbc-8adad7604ad4/glance-httpd/0.log" Mar 18 16:37:32 crc kubenswrapper[4696]: I0318 16:37:32.192083 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9351c230-91b7-40c0-afbc-8adad7604ad4/glance-log/0.log" Mar 18 16:37:32 crc kubenswrapper[4696]: I0318 16:37:32.301018 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1f1e526e-e856-452e-8fc6-26663ca20e4a/glance-httpd/0.log" Mar 18 16:37:32 crc kubenswrapper[4696]: I0318 16:37:32.363753 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_1f1e526e-e856-452e-8fc6-26663ca20e4a/glance-log/0.log" Mar 18 16:37:32 crc kubenswrapper[4696]: I0318 16:37:32.529883 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-59764c649b-dxxpb_abd090d6-037c-4cc7-907a-43293ce636ff/horizon/0.log" Mar 18 16:37:32 crc kubenswrapper[4696]: I0318 16:37:32.817899 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r27gg_f5a70cb2-3b7d-43ab-9ab6-c154a737db7d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:32 crc kubenswrapper[4696]: I0318 16:37:32.857833 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-59764c649b-dxxpb_abd090d6-037c-4cc7-907a-43293ce636ff/horizon-log/0.log" Mar 18 16:37:33 crc kubenswrapper[4696]: I0318 16:37:33.223992 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-8qtfd_aa8fa732-917d-4782-aa47-b1846179b603/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:33 crc kubenswrapper[4696]: I0318 16:37:33.297977 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b5955bfd6-zmfrz_811e96fe-c7fe-424f-b86f-043aaa273d62/keystone-api/0.log" Mar 18 16:37:33 crc kubenswrapper[4696]: I0318 16:37:33.356045 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29564161-mrc5s_e8aacc65-eb04-4cb3-8ab2-fb34b6769db4/keystone-cron/0.log" Mar 18 16:37:33 crc kubenswrapper[4696]: I0318 16:37:33.463249 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6b790582-ecd0-41b7-8f9c-f0ef9d2415db/kube-state-metrics/0.log" Mar 18 16:37:34 crc kubenswrapper[4696]: I0318 16:37:34.082368 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p_ded21247-5107-45ab-9b12-25cb76cdfda3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:34 crc kubenswrapper[4696]: I0318 16:37:34.224083 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667fb94989-br52g_13f1595d-6eb1-41a2-8cd9-12d80a38303f/neutron-api/0.log" Mar 18 16:37:34 crc kubenswrapper[4696]: I0318 16:37:34.232701 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667fb94989-br52g_13f1595d-6eb1-41a2-8cd9-12d80a38303f/neutron-httpd/0.log" Mar 18 16:37:34 crc kubenswrapper[4696]: I0318 16:37:34.431437 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7_a50d071c-8a54-4335-be8c-1842e52dcb81/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:34 crc kubenswrapper[4696]: I0318 16:37:34.885164 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_449c7dea-20e2-4b99-bec6-e3287082418a/nova-api-log/0.log" Mar 18 16:37:34 crc kubenswrapper[4696]: I0318 16:37:34.911703 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f6247665-ab0d-4101-acfd-c3da0f598788/nova-cell0-conductor-conductor/0.log" Mar 18 16:37:35 crc kubenswrapper[4696]: I0318 16:37:35.185442 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_449c7dea-20e2-4b99-bec6-e3287082418a/nova-api-api/0.log" Mar 18 16:37:35 crc kubenswrapper[4696]: I0318 16:37:35.208383 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_69cc8eab-a88e-49ce-830b-9e352aea0d5f/nova-cell1-conductor-conductor/0.log" Mar 18 16:37:35 crc kubenswrapper[4696]: I0318 16:37:35.289390 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_21cc776b-31bc-469a-9b50-930b0480541d/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 16:37:35 crc kubenswrapper[4696]: I0318 16:37:35.711789 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c03e61df-341f-42de-8682-c17255ffedcb/nova-metadata-log/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.058930 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a/nova-scheduler-scheduler/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.076944 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c03e61df-341f-42de-8682-c17255ffedcb/nova-metadata-metadata/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.100904 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-r4dhh_57f3ea1b-d23e-435c-826f-539c401753be/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.226795 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_87f019c6-a59d-4465-8fb8-c47b198c513b/mysql-bootstrap/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.484269 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_87f019c6-a59d-4465-8fb8-c47b198c513b/galera/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.581943 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_87f019c6-a59d-4465-8fb8-c47b198c513b/mysql-bootstrap/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.617031 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dd1c1de5-fda6-4306-bde0-736fd76a8f31/mysql-bootstrap/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.750148 4696 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dd1c1de5-fda6-4306-bde0-736fd76a8f31/mysql-bootstrap/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.752013 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dd1c1de5-fda6-4306-bde0-736fd76a8f31/galera/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.866264 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e/openstackclient/0.log" Mar 18 16:37:36 crc kubenswrapper[4696]: I0318 16:37:36.998428 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hb4gt_0b77b78e-7226-4d19-a9b7-190ad5248eb7/openstack-network-exporter/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.126603 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x4bkz_a37c7e7f-336d-4e95-b9ea-3750b49d4117/ovsdb-server-init/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.360444 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x4bkz_a37c7e7f-336d-4e95-b9ea-3750b49d4117/ovs-vswitchd/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.360727 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x4bkz_a37c7e7f-336d-4e95-b9ea-3750b49d4117/ovsdb-server-init/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.417144 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x4bkz_a37c7e7f-336d-4e95-b9ea-3750b49d4117/ovsdb-server/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.543061 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vb7xn_efa7f696-eda9-4cd4-953b-0a24e9935290/ovn-controller/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.733019 4696 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-gsd7h_8ac2ae34-5ffd-4557-96ae-c4d268e2cf73/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.747345 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6d256733-b9f7-484d-873a-b77e062f63c8/openstack-network-exporter/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.937320 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6d256733-b9f7-484d-873a-b77e062f63c8/ovn-northd/0.log" Mar 18 16:37:37 crc kubenswrapper[4696]: I0318 16:37:37.985501 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_50019b99-f0df-4582-ab2a-49f761bc0aa7/openstack-network-exporter/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.093605 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_50019b99-f0df-4582-ab2a-49f761bc0aa7/ovsdbserver-nb/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.184888 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8df5e2e0-02fe-4be7-ae7d-f92ea79ce510/openstack-network-exporter/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.237560 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8df5e2e0-02fe-4be7-ae7d-f92ea79ce510/ovsdbserver-sb/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.491295 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9749b5588-6wsv8_1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f/placement-api/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.568580 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9749b5588-6wsv8_1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f/placement-log/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.579215 4696 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ad207e86-aeb6-4af2-a411-dee8342b4fe9/setup-container/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.736613 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ad207e86-aeb6-4af2-a411-dee8342b4fe9/setup-container/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.827550 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ad207e86-aeb6-4af2-a411-dee8342b4fe9/rabbitmq/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.875289 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db68e71-4312-400b-8575-06f87bf6a781/setup-container/0.log" Mar 18 16:37:38 crc kubenswrapper[4696]: I0318 16:37:38.992366 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db68e71-4312-400b-8575-06f87bf6a781/setup-container/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.005796 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db68e71-4312-400b-8575-06f87bf6a781/rabbitmq/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.061745 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg_43fbb202-ffe4-40ba-b61e-ea284e533c1f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.234682 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-vxq5r_ae98a130-1216-4906-8e7b-3721a2857935/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.361818 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs_0489a724-0e24-4090-afc8-8d7baec47630/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.569511 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rshr7_174da10e-47cb-4e7a-8226-e7a4baeaf2ac/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.604674 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v47nc_3460c0d4-77fe-49fd-a525-52b831bf4ff6/ssh-known-hosts-edpm-deployment/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.797216 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6847f4969-jlnz4_2418339a-4137-4f64-b098-f0e5011d3f61/proxy-server/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.969943 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xj2ch_62b4b14f-0ab3-4906-9c97-8c3092cd5379/swift-ring-rebalance/0.log" Mar 18 16:37:39 crc kubenswrapper[4696]: I0318 16:37:39.978305 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6847f4969-jlnz4_2418339a-4137-4f64-b098-f0e5011d3f61/proxy-httpd/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.075720 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/account-auditor/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.171930 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/account-reaper/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.256723 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/account-replicator/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.289126 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/account-server/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.291950 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/container-auditor/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.415748 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/container-replicator/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.500481 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/container-updater/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.510878 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-auditor/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.513373 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/container-server/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.640327 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-expirer/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.688900 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-updater/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.732702 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-replicator/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.764631 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-server/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.863757 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/rsync/0.log" Mar 18 16:37:40 crc kubenswrapper[4696]: I0318 16:37:40.900658 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/swift-recon-cron/0.log" Mar 18 16:37:41 crc kubenswrapper[4696]: I0318 16:37:41.155158 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0b6d2f26-746f-404e-817e-ca3b65cc9511/tempest-tests-tempest-tests-runner/0.log" Mar 18 16:37:41 crc kubenswrapper[4696]: I0318 16:37:41.331937 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3166f000-157f-4751-a28c-a2e5f5caa4d9/test-operator-logs-container/0.log" Mar 18 16:37:41 crc kubenswrapper[4696]: I0318 16:37:41.477571 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zxchz_ffcf2496-8e16-4355-863a-7cad2e2357fe/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:41 crc kubenswrapper[4696]: I0318 16:37:41.522977 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc_9c5cf28b-0e58-48d1-bd91-2a403201c425/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:37:42 crc kubenswrapper[4696]: I0318 16:37:42.184350 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:37:42 crc kubenswrapper[4696]: I0318 16:37:42.184421 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:37:49 crc kubenswrapper[4696]: I0318 16:37:49.684401 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_88bcbc43-a512-4f0f-8ce6-e6fd9905df8b/memcached/0.log" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.143626 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564198-nglr8"] Mar 18 16:38:00 crc kubenswrapper[4696]: E0318 16:38:00.148677 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0a8fc0-1222-450b-a807-b3d22ec76ad4" containerName="container-00" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.148717 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0a8fc0-1222-450b-a807-b3d22ec76ad4" containerName="container-00" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.149085 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0a8fc0-1222-450b-a807-b3d22ec76ad4" containerName="container-00" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.149976 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-nglr8" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.152154 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.152202 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.152381 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.157603 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-nglr8"] Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.205721 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8zww\" (UniqueName: \"kubernetes.io/projected/7c1dc242-7619-477c-8bbf-c5b6cc2c7805-kube-api-access-r8zww\") pod \"auto-csr-approver-29564198-nglr8\" (UID: \"7c1dc242-7619-477c-8bbf-c5b6cc2c7805\") " pod="openshift-infra/auto-csr-approver-29564198-nglr8" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.307571 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8zww\" (UniqueName: \"kubernetes.io/projected/7c1dc242-7619-477c-8bbf-c5b6cc2c7805-kube-api-access-r8zww\") pod \"auto-csr-approver-29564198-nglr8\" (UID: \"7c1dc242-7619-477c-8bbf-c5b6cc2c7805\") " pod="openshift-infra/auto-csr-approver-29564198-nglr8" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.329353 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8zww\" (UniqueName: \"kubernetes.io/projected/7c1dc242-7619-477c-8bbf-c5b6cc2c7805-kube-api-access-r8zww\") pod \"auto-csr-approver-29564198-nglr8\" (UID: \"7c1dc242-7619-477c-8bbf-c5b6cc2c7805\") " 
pod="openshift-infra/auto-csr-approver-29564198-nglr8" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.471464 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-nglr8" Mar 18 16:38:00 crc kubenswrapper[4696]: I0318 16:38:00.929697 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-nglr8"] Mar 18 16:38:01 crc kubenswrapper[4696]: I0318 16:38:01.390496 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-nglr8" event={"ID":"7c1dc242-7619-477c-8bbf-c5b6cc2c7805","Type":"ContainerStarted","Data":"bce3e6d522088782d64cd2ae017fad1253331349c3d8091344f8df8ed71b72af"} Mar 18 16:38:03 crc kubenswrapper[4696]: I0318 16:38:03.409504 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-nglr8" event={"ID":"7c1dc242-7619-477c-8bbf-c5b6cc2c7805","Type":"ContainerStarted","Data":"d55797574f830c287dc9c3b1d3de8bab805a740b5af4d3557869dfa49c1ffc10"} Mar 18 16:38:03 crc kubenswrapper[4696]: I0318 16:38:03.427778 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564198-nglr8" podStartSLOduration=1.304762497 podStartE2EDuration="3.427759261s" podCreationTimestamp="2026-03-18 16:38:00 +0000 UTC" firstStartedPulling="2026-03-18 16:38:00.937168284 +0000 UTC m=+3723.943342490" lastFinishedPulling="2026-03-18 16:38:03.060165048 +0000 UTC m=+3726.066339254" observedRunningTime="2026-03-18 16:38:03.421512794 +0000 UTC m=+3726.427687010" watchObservedRunningTime="2026-03-18 16:38:03.427759261 +0000 UTC m=+3726.433933467" Mar 18 16:38:03 crc kubenswrapper[4696]: E0318 16:38:03.743759 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c1dc242_7619_477c_8bbf_c5b6cc2c7805.slice/crio-conmon-d55797574f830c287dc9c3b1d3de8bab805a740b5af4d3557869dfa49c1ffc10.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:38:04 crc kubenswrapper[4696]: I0318 16:38:04.420475 4696 generic.go:334] "Generic (PLEG): container finished" podID="7c1dc242-7619-477c-8bbf-c5b6cc2c7805" containerID="d55797574f830c287dc9c3b1d3de8bab805a740b5af4d3557869dfa49c1ffc10" exitCode=0 Mar 18 16:38:04 crc kubenswrapper[4696]: I0318 16:38:04.420547 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-nglr8" event={"ID":"7c1dc242-7619-477c-8bbf-c5b6cc2c7805","Type":"ContainerDied","Data":"d55797574f830c287dc9c3b1d3de8bab805a740b5af4d3557869dfa49c1ffc10"} Mar 18 16:38:05 crc kubenswrapper[4696]: I0318 16:38:05.774199 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-nglr8" Mar 18 16:38:05 crc kubenswrapper[4696]: I0318 16:38:05.853841 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8zww\" (UniqueName: \"kubernetes.io/projected/7c1dc242-7619-477c-8bbf-c5b6cc2c7805-kube-api-access-r8zww\") pod \"7c1dc242-7619-477c-8bbf-c5b6cc2c7805\" (UID: \"7c1dc242-7619-477c-8bbf-c5b6cc2c7805\") " Mar 18 16:38:05 crc kubenswrapper[4696]: I0318 16:38:05.865892 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c1dc242-7619-477c-8bbf-c5b6cc2c7805-kube-api-access-r8zww" (OuterVolumeSpecName: "kube-api-access-r8zww") pod "7c1dc242-7619-477c-8bbf-c5b6cc2c7805" (UID: "7c1dc242-7619-477c-8bbf-c5b6cc2c7805"). InnerVolumeSpecName "kube-api-access-r8zww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:38:05 crc kubenswrapper[4696]: I0318 16:38:05.907342 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/util/0.log" Mar 18 16:38:05 crc kubenswrapper[4696]: I0318 16:38:05.956008 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/util/0.log" Mar 18 16:38:05 crc kubenswrapper[4696]: I0318 16:38:05.956414 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8zww\" (UniqueName: \"kubernetes.io/projected/7c1dc242-7619-477c-8bbf-c5b6cc2c7805-kube-api-access-r8zww\") on node \"crc\" DevicePath \"\"" Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.042264 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/pull/0.log" Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.139942 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/pull/0.log" Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.296379 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/pull/0.log" Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.326399 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/util/0.log" Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.335231 4696 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/extract/0.log" Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.439372 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-nglr8" event={"ID":"7c1dc242-7619-477c-8bbf-c5b6cc2c7805","Type":"ContainerDied","Data":"bce3e6d522088782d64cd2ae017fad1253331349c3d8091344f8df8ed71b72af"} Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.439421 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce3e6d522088782d64cd2ae017fad1253331349c3d8091344f8df8ed71b72af" Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.439493 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-nglr8" Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.508190 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-25g4q"] Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.518659 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-25g4q"] Mar 18 16:38:06 crc kubenswrapper[4696]: I0318 16:38:06.963288 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-r4qqr_00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac/manager/0.log" Mar 18 16:38:07 crc kubenswrapper[4696]: I0318 16:38:07.250602 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-tpb84_123177a5-da82-4485-990a-d5ced4dbf8ca/manager/0.log" Mar 18 16:38:07 crc kubenswrapper[4696]: I0318 16:38:07.452018 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-dfclz_1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be/manager/0.log" Mar 18 16:38:07 crc kubenswrapper[4696]: I0318 16:38:07.625411 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17e10e7-6226-40f5-a445-4d73fb676335" path="/var/lib/kubelet/pods/e17e10e7-6226-40f5-a445-4d73fb676335/volumes" Mar 18 16:38:07 crc kubenswrapper[4696]: I0318 16:38:07.853585 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-7s46n_789669f2-e26b-4de8-ad21-801820b5806b/manager/0.log" Mar 18 16:38:07 crc kubenswrapper[4696]: I0318 16:38:07.922543 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-kfqqk_78640d4c-766f-4fd8-ab5f-54687b6fb5c6/manager/0.log" Mar 18 16:38:08 crc kubenswrapper[4696]: I0318 16:38:08.185799 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-5nkd5_e915aebf-c140-44ee-90b8-ce169df57fd9/manager/0.log" Mar 18 16:38:08 crc kubenswrapper[4696]: I0318 16:38:08.505791 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5595c7d6ff-gggxc_e5ef6f08-4538-435c-b5c8-42bac561d200/manager/0.log" Mar 18 16:38:08 crc kubenswrapper[4696]: I0318 16:38:08.509620 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-76b87776c9-5s8hj_61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f/manager/0.log" Mar 18 16:38:08 crc kubenswrapper[4696]: I0318 16:38:08.760931 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-v85hd_13174b57-caf5-46f2-8605-51e4de880253/manager/0.log" Mar 18 16:38:08 crc kubenswrapper[4696]: I0318 16:38:08.888533 4696 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-gm92k_411ef48e-d8ac-471f-9018-ee5fd534a4c9/manager/0.log" Mar 18 16:38:09 crc kubenswrapper[4696]: I0318 16:38:09.038940 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-crpf5_e74d1820-3e14-431f-866b-b0ab8b97f20f/manager/0.log" Mar 18 16:38:09 crc kubenswrapper[4696]: I0318 16:38:09.346446 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-blz69_e7082f0a-1b24-4fda-b9b2-eb957c569232/manager/0.log" Mar 18 16:38:09 crc kubenswrapper[4696]: I0318 16:38:09.413493 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-z87fb_afdee753-15ca-42fe-8cc1-937b42d07b85/manager/0.log" Mar 18 16:38:09 crc kubenswrapper[4696]: I0318 16:38:09.579788 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj_0eacab42-0fe3-4d23-b00c-81353faa98f8/manager/0.log" Mar 18 16:38:10 crc kubenswrapper[4696]: I0318 16:38:10.062279 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7bc867c5bc-c7qv6_2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e/operator/0.log" Mar 18 16:38:10 crc kubenswrapper[4696]: I0318 16:38:10.362054 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-96vqk_d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d/registry-server/0.log" Mar 18 16:38:10 crc kubenswrapper[4696]: I0318 16:38:10.606394 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-bn6ct_9597433a-1cf7-4455-8aa6-8709fef284dd/manager/0.log" Mar 18 16:38:10 crc kubenswrapper[4696]: I0318 16:38:10.754933 4696 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-sbh54_48afb8e4-ed3d-4c76-9be0-15279dda8889/manager/0.log" Mar 18 16:38:10 crc kubenswrapper[4696]: I0318 16:38:10.871804 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-9rttk_b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4/operator/0.log" Mar 18 16:38:11 crc kubenswrapper[4696]: I0318 16:38:11.116770 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-867f54bc44-wf58k_288ae45d-6c8b-4034-8e4d-e2af975bda6f/manager/0.log" Mar 18 16:38:11 crc kubenswrapper[4696]: I0318 16:38:11.351718 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-p7snf_06cdd947-c4dd-4ccf-bb4b-fffef57443d4/manager/0.log" Mar 18 16:38:11 crc kubenswrapper[4696]: I0318 16:38:11.400356 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d84559f47-x7vwf_a08cb64d-e133-4787-956b-4cef003ea78a/manager/0.log" Mar 18 16:38:11 crc kubenswrapper[4696]: I0318 16:38:11.552494 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65fbdb4fdd-njrtk_caa2772a-b8a8-4d65-8b8d-19d9c03c62d6/manager/0.log" Mar 18 16:38:11 crc kubenswrapper[4696]: I0318 16:38:11.574835 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-74d6f7b5c-8hndt_fa515a71-3c55-46b7-bab2-60cef0a2b2e1/manager/0.log" Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.193077 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.193131 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.193173 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.193890 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.193938 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" gracePeriod=600 Mar 18 16:38:12 crc kubenswrapper[4696]: E0318 16:38:12.338344 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 
16:38:12.500648 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" exitCode=0 Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.500717 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942"} Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.508886 4696 scope.go:117] "RemoveContainer" containerID="cf86dadce3d1fb257b7fe8784bc21654ea04954f310cfd6357d78c465ecfc986" Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.509685 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:38:12 crc kubenswrapper[4696]: E0318 16:38:12.509989 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:38:12 crc kubenswrapper[4696]: I0318 16:38:12.814375 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-g2jrg_ef87c345-1284-41dd-a5ae-57ae08c9558e/manager/0.log" Mar 18 16:38:25 crc kubenswrapper[4696]: I0318 16:38:25.656873 4696 scope.go:117] "RemoveContainer" containerID="2005ef9ef26f3cd38fa0714d0b9a8ef267ccb57ebc1cb89d0b7634de8e177985" Mar 18 16:38:27 crc kubenswrapper[4696]: I0318 16:38:27.603494 4696 scope.go:117] "RemoveContainer" 
containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:38:27 crc kubenswrapper[4696]: E0318 16:38:27.604114 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:38:31 crc kubenswrapper[4696]: I0318 16:38:31.324285 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gx9sb_b0297eb3-0438-4db1-97bd-405779a01255/control-plane-machine-set-operator/0.log" Mar 18 16:38:31 crc kubenswrapper[4696]: I0318 16:38:31.474572 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6xjgb_0c05edf1-5079-4212-ba5c-19621b2500cf/kube-rbac-proxy/0.log" Mar 18 16:38:31 crc kubenswrapper[4696]: I0318 16:38:31.480838 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6xjgb_0c05edf1-5079-4212-ba5c-19621b2500cf/machine-api-operator/0.log" Mar 18 16:38:42 crc kubenswrapper[4696]: I0318 16:38:42.597261 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:38:42 crc kubenswrapper[4696]: E0318 16:38:42.599510 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" 
podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:38:43 crc kubenswrapper[4696]: I0318 16:38:43.145388 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-lhsfv_e65694c1-c09d-4ab3-8032-640197b84e20/cert-manager-controller/0.log" Mar 18 16:38:43 crc kubenswrapper[4696]: I0318 16:38:43.365378 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-wmgb2_e4aa5b80-28c0-4b73-91ef-a0b1325d7823/cert-manager-cainjector/0.log" Mar 18 16:38:43 crc kubenswrapper[4696]: I0318 16:38:43.479482 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-7ncrl_ed08594c-854d-4c4d-8390-025916809f21/cert-manager-webhook/0.log" Mar 18 16:38:56 crc kubenswrapper[4696]: I0318 16:38:56.049921 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-pngnv_ea18cdb1-cb1f-46b3-af17-e834b51c6803/nmstate-console-plugin/0.log" Mar 18 16:38:56 crc kubenswrapper[4696]: I0318 16:38:56.176447 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qtwr9_7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704/nmstate-handler/0.log" Mar 18 16:38:56 crc kubenswrapper[4696]: I0318 16:38:56.288941 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-jlrq4_90a85ff7-6f9a-40c4-b528-15f0c3739a2b/kube-rbac-proxy/0.log" Mar 18 16:38:56 crc kubenswrapper[4696]: I0318 16:38:56.314547 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-jlrq4_90a85ff7-6f9a-40c4-b528-15f0c3739a2b/nmstate-metrics/0.log" Mar 18 16:38:56 crc kubenswrapper[4696]: I0318 16:38:56.420964 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-f8s88_3c50fff4-65a2-49c1-997a-658bc72f1fe7/nmstate-operator/0.log" Mar 18 16:38:56 crc 
kubenswrapper[4696]: I0318 16:38:56.498031 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-pn2r6_d0117f34-5320-46c0-952f-54d4abacdce4/nmstate-webhook/0.log" Mar 18 16:38:56 crc kubenswrapper[4696]: I0318 16:38:56.597047 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:38:56 crc kubenswrapper[4696]: E0318 16:38:56.597460 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:39:07 crc kubenswrapper[4696]: I0318 16:39:07.606936 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:39:07 crc kubenswrapper[4696]: E0318 16:39:07.607739 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.142141 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-fnvx2_e1b08d64-c01c-4cb5-b1ee-8cfc03868c70/kube-rbac-proxy/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.271147 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-fnvx2_e1b08d64-c01c-4cb5-b1ee-8cfc03868c70/controller/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.378489 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-frr-files/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.521988 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-frr-files/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.538062 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-metrics/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.538882 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-reloader/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.597294 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:39:22 crc kubenswrapper[4696]: E0318 16:39:22.597850 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.598675 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-reloader/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.777079 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-frr-files/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.785730 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-metrics/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.805018 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-metrics/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.810560 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-reloader/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.993818 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-reloader/0.log" Mar 18 16:39:22 crc kubenswrapper[4696]: I0318 16:39:22.997779 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-metrics/0.log" Mar 18 16:39:23 crc kubenswrapper[4696]: I0318 16:39:23.012360 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-frr-files/0.log" Mar 18 16:39:23 crc kubenswrapper[4696]: I0318 16:39:23.031108 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/controller/0.log" Mar 18 16:39:23 crc kubenswrapper[4696]: I0318 16:39:23.205676 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/frr-metrics/0.log" Mar 18 16:39:23 crc kubenswrapper[4696]: I0318 16:39:23.239189 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/kube-rbac-proxy/0.log" Mar 18 16:39:23 crc kubenswrapper[4696]: I0318 16:39:23.264979 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/kube-rbac-proxy-frr/0.log" Mar 18 16:39:23 crc kubenswrapper[4696]: I0318 16:39:23.381321 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/reloader/0.log" Mar 18 16:39:23 crc kubenswrapper[4696]: I0318 16:39:23.971197 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-whtkt_58a00a45-b349-471d-816b-a05268da02e4/frr-k8s-webhook-server/0.log" Mar 18 16:39:24 crc kubenswrapper[4696]: I0318 16:39:24.168542 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76ff64997f-7v6kl_d823fa6b-b1c9-4c8e-9da9-49e457c2fae6/manager/0.log" Mar 18 16:39:24 crc kubenswrapper[4696]: I0318 16:39:24.287983 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c9479f99b-72fxd_b28709b3-7641-45cc-9e79-9be140d2bcae/webhook-server/0.log" Mar 18 16:39:24 crc kubenswrapper[4696]: I0318 16:39:24.441757 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pr9g2_5556d2f1-1113-45f4-89d4-deea421bb0aa/kube-rbac-proxy/0.log" Mar 18 16:39:24 crc kubenswrapper[4696]: I0318 16:39:24.816046 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/frr/0.log" Mar 18 16:39:24 crc kubenswrapper[4696]: I0318 16:39:24.923576 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pr9g2_5556d2f1-1113-45f4-89d4-deea421bb0aa/speaker/0.log" Mar 18 16:39:34 crc kubenswrapper[4696]: I0318 16:39:34.597348 4696 scope.go:117] 
"RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:39:34 crc kubenswrapper[4696]: E0318 16:39:34.598129 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:39:36 crc kubenswrapper[4696]: I0318 16:39:36.795587 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/util/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.024500 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/util/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.101882 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/pull/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.110821 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/pull/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.295063 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/extract/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.310006 4696 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/util/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.319182 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/pull/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.471823 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/util/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.652495 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/pull/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.679554 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/pull/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.684728 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/util/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.842647 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/util/0.log" Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.843347 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/pull/0.log" 
Mar 18 16:39:37 crc kubenswrapper[4696]: I0318 16:39:37.917984 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/extract/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.055537 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-utilities/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.240445 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-content/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.282930 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-utilities/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.288286 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-content/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.425828 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-utilities/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.435406 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-content/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.611790 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-utilities/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.868453 4696 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-utilities/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.882577 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-content/0.log" Mar 18 16:39:38 crc kubenswrapper[4696]: I0318 16:39:38.900721 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-content/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.100951 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-content/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.101670 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-utilities/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.204303 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/registry-server/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.387300 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-v84rb_71a9fceb-5471-42f1-867d-28f7196daf81/marketplace-operator/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.607905 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-utilities/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.808188 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/registry-server/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.842259 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-utilities/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.853769 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-content/0.log" Mar 18 16:39:39 crc kubenswrapper[4696]: I0318 16:39:39.899287 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-content/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.037259 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-content/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.037465 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-utilities/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.281824 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/registry-server/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.294813 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-utilities/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.433676 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-content/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.457839 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-content/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.458080 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-utilities/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.597332 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-utilities/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.635912 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-content/0.log" Mar 18 16:39:40 crc kubenswrapper[4696]: I0318 16:39:40.762740 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/registry-server/0.log" Mar 18 16:39:49 crc kubenswrapper[4696]: I0318 16:39:49.599371 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:39:49 crc kubenswrapper[4696]: E0318 16:39:49.600217 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:40:00 crc 
kubenswrapper[4696]: I0318 16:40:00.152444 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564200-bdfr4"] Mar 18 16:40:00 crc kubenswrapper[4696]: E0318 16:40:00.153229 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c1dc242-7619-477c-8bbf-c5b6cc2c7805" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.153241 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c1dc242-7619-477c-8bbf-c5b6cc2c7805" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.153433 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c1dc242-7619-477c-8bbf-c5b6cc2c7805" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.154084 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-bdfr4" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.158881 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.159029 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.159295 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.169758 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-bdfr4"] Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.229777 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwpfc\" (UniqueName: \"kubernetes.io/projected/87cfc7b5-4df3-493e-922e-d4a75ba9798b-kube-api-access-wwpfc\") pod \"auto-csr-approver-29564200-bdfr4\" (UID: \"87cfc7b5-4df3-493e-922e-d4a75ba9798b\") " 
pod="openshift-infra/auto-csr-approver-29564200-bdfr4" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.331783 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwpfc\" (UniqueName: \"kubernetes.io/projected/87cfc7b5-4df3-493e-922e-d4a75ba9798b-kube-api-access-wwpfc\") pod \"auto-csr-approver-29564200-bdfr4\" (UID: \"87cfc7b5-4df3-493e-922e-d4a75ba9798b\") " pod="openshift-infra/auto-csr-approver-29564200-bdfr4" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.350187 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwpfc\" (UniqueName: \"kubernetes.io/projected/87cfc7b5-4df3-493e-922e-d4a75ba9798b-kube-api-access-wwpfc\") pod \"auto-csr-approver-29564200-bdfr4\" (UID: \"87cfc7b5-4df3-493e-922e-d4a75ba9798b\") " pod="openshift-infra/auto-csr-approver-29564200-bdfr4" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.473938 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-bdfr4" Mar 18 16:40:00 crc kubenswrapper[4696]: I0318 16:40:00.598611 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:40:00 crc kubenswrapper[4696]: E0318 16:40:00.599142 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:40:01 crc kubenswrapper[4696]: I0318 16:40:01.024012 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-bdfr4"] Mar 18 16:40:01 crc kubenswrapper[4696]: W0318 16:40:01.066048 4696 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87cfc7b5_4df3_493e_922e_d4a75ba9798b.slice/crio-34b5098f53757194c24e6b7d18ca0d343f2cc1e419ede8989431cd1269d4e814 WatchSource:0}: Error finding container 34b5098f53757194c24e6b7d18ca0d343f2cc1e419ede8989431cd1269d4e814: Status 404 returned error can't find the container with id 34b5098f53757194c24e6b7d18ca0d343f2cc1e419ede8989431cd1269d4e814 Mar 18 16:40:01 crc kubenswrapper[4696]: I0318 16:40:01.419118 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-bdfr4" event={"ID":"87cfc7b5-4df3-493e-922e-d4a75ba9798b","Type":"ContainerStarted","Data":"34b5098f53757194c24e6b7d18ca0d343f2cc1e419ede8989431cd1269d4e814"} Mar 18 16:40:04 crc kubenswrapper[4696]: I0318 16:40:04.446193 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-bdfr4" event={"ID":"87cfc7b5-4df3-493e-922e-d4a75ba9798b","Type":"ContainerStarted","Data":"50f946607503ec571a0fc009a4fe5c8e330237312d9f5dafdc6cfedcf2d64ea6"} Mar 18 16:40:05 crc kubenswrapper[4696]: I0318 16:40:05.456420 4696 generic.go:334] "Generic (PLEG): container finished" podID="87cfc7b5-4df3-493e-922e-d4a75ba9798b" containerID="50f946607503ec571a0fc009a4fe5c8e330237312d9f5dafdc6cfedcf2d64ea6" exitCode=0 Mar 18 16:40:05 crc kubenswrapper[4696]: I0318 16:40:05.456905 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-bdfr4" event={"ID":"87cfc7b5-4df3-493e-922e-d4a75ba9798b","Type":"ContainerDied","Data":"50f946607503ec571a0fc009a4fe5c8e330237312d9f5dafdc6cfedcf2d64ea6"} Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.015669 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-bdfr4" Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.092945 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwpfc\" (UniqueName: \"kubernetes.io/projected/87cfc7b5-4df3-493e-922e-d4a75ba9798b-kube-api-access-wwpfc\") pod \"87cfc7b5-4df3-493e-922e-d4a75ba9798b\" (UID: \"87cfc7b5-4df3-493e-922e-d4a75ba9798b\") " Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.101398 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cfc7b5-4df3-493e-922e-d4a75ba9798b-kube-api-access-wwpfc" (OuterVolumeSpecName: "kube-api-access-wwpfc") pod "87cfc7b5-4df3-493e-922e-d4a75ba9798b" (UID: "87cfc7b5-4df3-493e-922e-d4a75ba9798b"). InnerVolumeSpecName "kube-api-access-wwpfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.195635 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwpfc\" (UniqueName: \"kubernetes.io/projected/87cfc7b5-4df3-493e-922e-d4a75ba9798b-kube-api-access-wwpfc\") on node \"crc\" DevicePath \"\"" Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.479933 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-bdfr4" event={"ID":"87cfc7b5-4df3-493e-922e-d4a75ba9798b","Type":"ContainerDied","Data":"34b5098f53757194c24e6b7d18ca0d343f2cc1e419ede8989431cd1269d4e814"} Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.479976 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34b5098f53757194c24e6b7d18ca0d343f2cc1e419ede8989431cd1269d4e814" Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.480705 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-bdfr4" Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.553254 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-662zt"] Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.580783 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-662zt"] Mar 18 16:40:07 crc kubenswrapper[4696]: I0318 16:40:07.608857 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fa66a8b-5bc8-4701-8d68-2270067943a6" path="/var/lib/kubelet/pods/5fa66a8b-5bc8-4701-8d68-2270067943a6/volumes" Mar 18 16:40:11 crc kubenswrapper[4696]: I0318 16:40:11.601357 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:40:11 crc kubenswrapper[4696]: E0318 16:40:11.602398 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:40:25 crc kubenswrapper[4696]: I0318 16:40:25.598100 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:40:25 crc kubenswrapper[4696]: E0318 16:40:25.599705 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" 
podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:40:25 crc kubenswrapper[4696]: I0318 16:40:25.763340 4696 scope.go:117] "RemoveContainer" containerID="ac5cd905e7460a32d079207deaef6bf06148fb1871f95dac27f9c7f3598d56d5" Mar 18 16:40:36 crc kubenswrapper[4696]: I0318 16:40:36.597366 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:40:36 crc kubenswrapper[4696]: E0318 16:40:36.598367 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:40:51 crc kubenswrapper[4696]: I0318 16:40:51.598647 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:40:51 crc kubenswrapper[4696]: E0318 16:40:51.599576 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:41:03 crc kubenswrapper[4696]: I0318 16:41:03.602908 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:41:03 crc kubenswrapper[4696]: E0318 16:41:03.603601 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:41:16 crc kubenswrapper[4696]: I0318 16:41:16.597814 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:41:16 crc kubenswrapper[4696]: E0318 16:41:16.598742 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:41:29 crc kubenswrapper[4696]: I0318 16:41:29.598085 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:41:29 crc kubenswrapper[4696]: E0318 16:41:29.598802 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:41:31 crc kubenswrapper[4696]: I0318 16:41:31.382176 4696 generic.go:334] "Generic (PLEG): container finished" podID="7293f441-53f0-4872-9d79-c1198766aa86" containerID="b2d87c0a02b40a5666781ec6a05a06e5158b12645914a14bf1e86cc5580a5f7f" exitCode=0 Mar 18 16:41:31 crc kubenswrapper[4696]: I0318 16:41:31.382246 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-ng8w4/must-gather-9rxkb" event={"ID":"7293f441-53f0-4872-9d79-c1198766aa86","Type":"ContainerDied","Data":"b2d87c0a02b40a5666781ec6a05a06e5158b12645914a14bf1e86cc5580a5f7f"} Mar 18 16:41:31 crc kubenswrapper[4696]: I0318 16:41:31.383170 4696 scope.go:117] "RemoveContainer" containerID="b2d87c0a02b40a5666781ec6a05a06e5158b12645914a14bf1e86cc5580a5f7f" Mar 18 16:41:32 crc kubenswrapper[4696]: I0318 16:41:32.025403 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ng8w4_must-gather-9rxkb_7293f441-53f0-4872-9d79-c1198766aa86/gather/0.log" Mar 18 16:41:34 crc kubenswrapper[4696]: E0318 16:41:34.179873 4696 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.192:35252->38.102.83.192:37641: write tcp 38.102.83.192:35252->38.102.83.192:37641: write: broken pipe Mar 18 16:41:34 crc kubenswrapper[4696]: I0318 16:41:34.826116 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k6hhx"] Mar 18 16:41:34 crc kubenswrapper[4696]: E0318 16:41:34.827268 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cfc7b5-4df3-493e-922e-d4a75ba9798b" containerName="oc" Mar 18 16:41:34 crc kubenswrapper[4696]: I0318 16:41:34.827295 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cfc7b5-4df3-493e-922e-d4a75ba9798b" containerName="oc" Mar 18 16:41:34 crc kubenswrapper[4696]: I0318 16:41:34.827546 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="87cfc7b5-4df3-493e-922e-d4a75ba9798b" containerName="oc" Mar 18 16:41:34 crc kubenswrapper[4696]: I0318 16:41:34.828811 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:34 crc kubenswrapper[4696]: I0318 16:41:34.840024 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6hhx"] Mar 18 16:41:34 crc kubenswrapper[4696]: I0318 16:41:34.899431 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-catalog-content\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:34 crc kubenswrapper[4696]: I0318 16:41:34.899865 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pcrl\" (UniqueName: \"kubernetes.io/projected/82229e9c-40c8-406c-a4a7-5f5a316f61d0-kube-api-access-6pcrl\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:34 crc kubenswrapper[4696]: I0318 16:41:34.899908 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-utilities\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:35 crc kubenswrapper[4696]: I0318 16:41:35.002155 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pcrl\" (UniqueName: \"kubernetes.io/projected/82229e9c-40c8-406c-a4a7-5f5a316f61d0-kube-api-access-6pcrl\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:35 crc kubenswrapper[4696]: I0318 16:41:35.002227 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-utilities\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:35 crc kubenswrapper[4696]: I0318 16:41:35.002312 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-catalog-content\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:35 crc kubenswrapper[4696]: I0318 16:41:35.002888 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-catalog-content\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:35 crc kubenswrapper[4696]: I0318 16:41:35.002934 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-utilities\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:35 crc kubenswrapper[4696]: I0318 16:41:35.028507 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pcrl\" (UniqueName: \"kubernetes.io/projected/82229e9c-40c8-406c-a4a7-5f5a316f61d0-kube-api-access-6pcrl\") pod \"redhat-marketplace-k6hhx\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:35 crc kubenswrapper[4696]: I0318 16:41:35.155672 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:35 crc kubenswrapper[4696]: I0318 16:41:35.656376 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6hhx"] Mar 18 16:41:36 crc kubenswrapper[4696]: I0318 16:41:36.425913 4696 generic.go:334] "Generic (PLEG): container finished" podID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerID="48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746" exitCode=0 Mar 18 16:41:36 crc kubenswrapper[4696]: I0318 16:41:36.425979 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6hhx" event={"ID":"82229e9c-40c8-406c-a4a7-5f5a316f61d0","Type":"ContainerDied","Data":"48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746"} Mar 18 16:41:36 crc kubenswrapper[4696]: I0318 16:41:36.426493 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6hhx" event={"ID":"82229e9c-40c8-406c-a4a7-5f5a316f61d0","Type":"ContainerStarted","Data":"eb381efbb55cc6fdcd9f69f4814710146cd2e976b09c5df6ea995f975e3e8b0a"} Mar 18 16:41:36 crc kubenswrapper[4696]: I0318 16:41:36.427821 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:41:38 crc kubenswrapper[4696]: I0318 16:41:38.451613 4696 generic.go:334] "Generic (PLEG): container finished" podID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerID="1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422" exitCode=0 Mar 18 16:41:38 crc kubenswrapper[4696]: I0318 16:41:38.451834 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6hhx" event={"ID":"82229e9c-40c8-406c-a4a7-5f5a316f61d0","Type":"ContainerDied","Data":"1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422"} Mar 18 16:41:39 crc kubenswrapper[4696]: I0318 16:41:39.462850 4696 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-k6hhx" event={"ID":"82229e9c-40c8-406c-a4a7-5f5a316f61d0","Type":"ContainerStarted","Data":"7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f"} Mar 18 16:41:39 crc kubenswrapper[4696]: I0318 16:41:39.487630 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k6hhx" podStartSLOduration=2.838683416 podStartE2EDuration="5.487610801s" podCreationTimestamp="2026-03-18 16:41:34 +0000 UTC" firstStartedPulling="2026-03-18 16:41:36.427576507 +0000 UTC m=+3939.433750713" lastFinishedPulling="2026-03-18 16:41:39.076503902 +0000 UTC m=+3942.082678098" observedRunningTime="2026-03-18 16:41:39.481630111 +0000 UTC m=+3942.487804337" watchObservedRunningTime="2026-03-18 16:41:39.487610801 +0000 UTC m=+3942.493785007" Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.280496 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ng8w4/must-gather-9rxkb"] Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.281194 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" podUID="7293f441-53f0-4872-9d79-c1198766aa86" containerName="copy" containerID="cri-o://d86441c8d99f7bb010e60a89e59299ba9d4fcb66f33de1d5276f2f3194814404" gracePeriod=2 Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.291696 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ng8w4/must-gather-9rxkb"] Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.483427 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ng8w4_must-gather-9rxkb_7293f441-53f0-4872-9d79-c1198766aa86/copy/0.log" Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.483987 4696 generic.go:334] "Generic (PLEG): container finished" podID="7293f441-53f0-4872-9d79-c1198766aa86" 
containerID="d86441c8d99f7bb010e60a89e59299ba9d4fcb66f33de1d5276f2f3194814404" exitCode=143 Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.753013 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ng8w4_must-gather-9rxkb_7293f441-53f0-4872-9d79-c1198766aa86/copy/0.log" Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.753783 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.918675 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7293f441-53f0-4872-9d79-c1198766aa86-must-gather-output\") pod \"7293f441-53f0-4872-9d79-c1198766aa86\" (UID: \"7293f441-53f0-4872-9d79-c1198766aa86\") " Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.918858 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljrhx\" (UniqueName: \"kubernetes.io/projected/7293f441-53f0-4872-9d79-c1198766aa86-kube-api-access-ljrhx\") pod \"7293f441-53f0-4872-9d79-c1198766aa86\" (UID: \"7293f441-53f0-4872-9d79-c1198766aa86\") " Mar 18 16:41:40 crc kubenswrapper[4696]: I0318 16:41:40.923675 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7293f441-53f0-4872-9d79-c1198766aa86-kube-api-access-ljrhx" (OuterVolumeSpecName: "kube-api-access-ljrhx") pod "7293f441-53f0-4872-9d79-c1198766aa86" (UID: "7293f441-53f0-4872-9d79-c1198766aa86"). InnerVolumeSpecName "kube-api-access-ljrhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.025341 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljrhx\" (UniqueName: \"kubernetes.io/projected/7293f441-53f0-4872-9d79-c1198766aa86-kube-api-access-ljrhx\") on node \"crc\" DevicePath \"\"" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.065800 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7293f441-53f0-4872-9d79-c1198766aa86-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7293f441-53f0-4872-9d79-c1198766aa86" (UID: "7293f441-53f0-4872-9d79-c1198766aa86"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.128120 4696 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7293f441-53f0-4872-9d79-c1198766aa86-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.495912 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ng8w4_must-gather-9rxkb_7293f441-53f0-4872-9d79-c1198766aa86/copy/0.log" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.496563 4696 scope.go:117] "RemoveContainer" containerID="d86441c8d99f7bb010e60a89e59299ba9d4fcb66f33de1d5276f2f3194814404" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.496676 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ng8w4/must-gather-9rxkb" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.529576 4696 scope.go:117] "RemoveContainer" containerID="b2d87c0a02b40a5666781ec6a05a06e5158b12645914a14bf1e86cc5580a5f7f" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.597658 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:41:41 crc kubenswrapper[4696]: E0318 16:41:41.598087 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:41:41 crc kubenswrapper[4696]: I0318 16:41:41.618058 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7293f441-53f0-4872-9d79-c1198766aa86" path="/var/lib/kubelet/pods/7293f441-53f0-4872-9d79-c1198766aa86/volumes" Mar 18 16:41:45 crc kubenswrapper[4696]: I0318 16:41:45.156270 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:45 crc kubenswrapper[4696]: I0318 16:41:45.156867 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:45 crc kubenswrapper[4696]: I0318 16:41:45.201559 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:45 crc kubenswrapper[4696]: I0318 16:41:45.575686 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:45 crc kubenswrapper[4696]: I0318 16:41:45.621763 
4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6hhx"] Mar 18 16:41:47 crc kubenswrapper[4696]: I0318 16:41:47.549684 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k6hhx" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerName="registry-server" containerID="cri-o://7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f" gracePeriod=2 Mar 18 16:41:47 crc kubenswrapper[4696]: I0318 16:41:47.963288 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.057324 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-utilities\") pod \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.057597 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-catalog-content\") pod \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.057680 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pcrl\" (UniqueName: \"kubernetes.io/projected/82229e9c-40c8-406c-a4a7-5f5a316f61d0-kube-api-access-6pcrl\") pod \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\" (UID: \"82229e9c-40c8-406c-a4a7-5f5a316f61d0\") " Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.058930 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-utilities" (OuterVolumeSpecName: "utilities") pod 
"82229e9c-40c8-406c-a4a7-5f5a316f61d0" (UID: "82229e9c-40c8-406c-a4a7-5f5a316f61d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.063580 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82229e9c-40c8-406c-a4a7-5f5a316f61d0-kube-api-access-6pcrl" (OuterVolumeSpecName: "kube-api-access-6pcrl") pod "82229e9c-40c8-406c-a4a7-5f5a316f61d0" (UID: "82229e9c-40c8-406c-a4a7-5f5a316f61d0"). InnerVolumeSpecName "kube-api-access-6pcrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.141641 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82229e9c-40c8-406c-a4a7-5f5a316f61d0" (UID: "82229e9c-40c8-406c-a4a7-5f5a316f61d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.160026 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.160065 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82229e9c-40c8-406c-a4a7-5f5a316f61d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.160075 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pcrl\" (UniqueName: \"kubernetes.io/projected/82229e9c-40c8-406c-a4a7-5f5a316f61d0-kube-api-access-6pcrl\") on node \"crc\" DevicePath \"\"" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.574852 4696 generic.go:334] "Generic (PLEG): container finished" podID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerID="7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f" exitCode=0 Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.574896 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6hhx" event={"ID":"82229e9c-40c8-406c-a4a7-5f5a316f61d0","Type":"ContainerDied","Data":"7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f"} Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.574933 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6hhx" event={"ID":"82229e9c-40c8-406c-a4a7-5f5a316f61d0","Type":"ContainerDied","Data":"eb381efbb55cc6fdcd9f69f4814710146cd2e976b09c5df6ea995f975e3e8b0a"} Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.574929 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6hhx" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.574952 4696 scope.go:117] "RemoveContainer" containerID="7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.615543 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6hhx"] Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.617189 4696 scope.go:117] "RemoveContainer" containerID="1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.624281 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6hhx"] Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.646628 4696 scope.go:117] "RemoveContainer" containerID="48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.687131 4696 scope.go:117] "RemoveContainer" containerID="7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f" Mar 18 16:41:48 crc kubenswrapper[4696]: E0318 16:41:48.687861 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f\": container with ID starting with 7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f not found: ID does not exist" containerID="7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.687915 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f"} err="failed to get container status \"7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f\": rpc error: code = NotFound desc = could not find container 
\"7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f\": container with ID starting with 7ce7f2d62438a9ab6e88d91378ec6ef196190f3ab9963cdc03c8a8eca11d6f3f not found: ID does not exist" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.687942 4696 scope.go:117] "RemoveContainer" containerID="1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422" Mar 18 16:41:48 crc kubenswrapper[4696]: E0318 16:41:48.688576 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422\": container with ID starting with 1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422 not found: ID does not exist" containerID="1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.688603 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422"} err="failed to get container status \"1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422\": rpc error: code = NotFound desc = could not find container \"1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422\": container with ID starting with 1a59d6323efa927ba55b9e7c26fe29812ebf8919af712264f8fd1f977cdad422 not found: ID does not exist" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.688616 4696 scope.go:117] "RemoveContainer" containerID="48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746" Mar 18 16:41:48 crc kubenswrapper[4696]: E0318 16:41:48.688914 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746\": container with ID starting with 48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746 not found: ID does not exist" 
containerID="48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746" Mar 18 16:41:48 crc kubenswrapper[4696]: I0318 16:41:48.688953 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746"} err="failed to get container status \"48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746\": rpc error: code = NotFound desc = could not find container \"48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746\": container with ID starting with 48d10f7addb3210552df06988d4300dcac910f97d8e7b5c68ff071a2d7eb9746 not found: ID does not exist" Mar 18 16:41:49 crc kubenswrapper[4696]: I0318 16:41:49.612755 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" path="/var/lib/kubelet/pods/82229e9c-40c8-406c-a4a7-5f5a316f61d0/volumes" Mar 18 16:41:52 crc kubenswrapper[4696]: I0318 16:41:52.597225 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:41:52 crc kubenswrapper[4696]: E0318 16:41:52.597932 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.176539 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564202-p2zwh"] Mar 18 16:42:00 crc kubenswrapper[4696]: E0318 16:42:00.177559 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerName="extract-content" Mar 18 16:42:00 crc 
kubenswrapper[4696]: I0318 16:42:00.177582 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerName="extract-content" Mar 18 16:42:00 crc kubenswrapper[4696]: E0318 16:42:00.177626 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7293f441-53f0-4872-9d79-c1198766aa86" containerName="copy" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.177635 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7293f441-53f0-4872-9d79-c1198766aa86" containerName="copy" Mar 18 16:42:00 crc kubenswrapper[4696]: E0318 16:42:00.177660 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerName="extract-utilities" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.177670 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerName="extract-utilities" Mar 18 16:42:00 crc kubenswrapper[4696]: E0318 16:42:00.177703 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerName="registry-server" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.177710 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerName="registry-server" Mar 18 16:42:00 crc kubenswrapper[4696]: E0318 16:42:00.177726 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7293f441-53f0-4872-9d79-c1198766aa86" containerName="gather" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.177734 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7293f441-53f0-4872-9d79-c1198766aa86" containerName="gather" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.177988 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="7293f441-53f0-4872-9d79-c1198766aa86" containerName="gather" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.178008 4696 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7293f441-53f0-4872-9d79-c1198766aa86" containerName="copy" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.178027 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="82229e9c-40c8-406c-a4a7-5f5a316f61d0" containerName="registry-server" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.178799 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-p2zwh" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.181979 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.184644 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.190157 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmqxq\" (UniqueName: \"kubernetes.io/projected/d625c9e2-dc61-4c51-ad5d-5d38203c3739-kube-api-access-hmqxq\") pod \"auto-csr-approver-29564202-p2zwh\" (UID: \"d625c9e2-dc61-4c51-ad5d-5d38203c3739\") " pod="openshift-infra/auto-csr-approver-29564202-p2zwh" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.191365 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-p2zwh"] Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.193344 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.295055 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmqxq\" (UniqueName: \"kubernetes.io/projected/d625c9e2-dc61-4c51-ad5d-5d38203c3739-kube-api-access-hmqxq\") pod \"auto-csr-approver-29564202-p2zwh\" (UID: 
\"d625c9e2-dc61-4c51-ad5d-5d38203c3739\") " pod="openshift-infra/auto-csr-approver-29564202-p2zwh" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.323684 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmqxq\" (UniqueName: \"kubernetes.io/projected/d625c9e2-dc61-4c51-ad5d-5d38203c3739-kube-api-access-hmqxq\") pod \"auto-csr-approver-29564202-p2zwh\" (UID: \"d625c9e2-dc61-4c51-ad5d-5d38203c3739\") " pod="openshift-infra/auto-csr-approver-29564202-p2zwh" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.498245 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-p2zwh" Mar 18 16:42:00 crc kubenswrapper[4696]: I0318 16:42:00.955599 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-p2zwh"] Mar 18 16:42:01 crc kubenswrapper[4696]: I0318 16:42:01.697257 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564202-p2zwh" event={"ID":"d625c9e2-dc61-4c51-ad5d-5d38203c3739","Type":"ContainerStarted","Data":"8152b312fbd2cee92222ca92b178710d63289915f23c5007a2cd5f23b461a6fc"} Mar 18 16:42:02 crc kubenswrapper[4696]: I0318 16:42:02.707330 4696 generic.go:334] "Generic (PLEG): container finished" podID="d625c9e2-dc61-4c51-ad5d-5d38203c3739" containerID="c9f027914cf1a7e9dd9c570f472cec519a4a2481e23f87f1b268461f6106e893" exitCode=0 Mar 18 16:42:02 crc kubenswrapper[4696]: I0318 16:42:02.707377 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564202-p2zwh" event={"ID":"d625c9e2-dc61-4c51-ad5d-5d38203c3739","Type":"ContainerDied","Data":"c9f027914cf1a7e9dd9c570f472cec519a4a2481e23f87f1b268461f6106e893"} Mar 18 16:42:04 crc kubenswrapper[4696]: I0318 16:42:04.124898 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-p2zwh" Mar 18 16:42:04 crc kubenswrapper[4696]: I0318 16:42:04.164535 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmqxq\" (UniqueName: \"kubernetes.io/projected/d625c9e2-dc61-4c51-ad5d-5d38203c3739-kube-api-access-hmqxq\") pod \"d625c9e2-dc61-4c51-ad5d-5d38203c3739\" (UID: \"d625c9e2-dc61-4c51-ad5d-5d38203c3739\") " Mar 18 16:42:04 crc kubenswrapper[4696]: I0318 16:42:04.171380 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d625c9e2-dc61-4c51-ad5d-5d38203c3739-kube-api-access-hmqxq" (OuterVolumeSpecName: "kube-api-access-hmqxq") pod "d625c9e2-dc61-4c51-ad5d-5d38203c3739" (UID: "d625c9e2-dc61-4c51-ad5d-5d38203c3739"). InnerVolumeSpecName "kube-api-access-hmqxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:42:04 crc kubenswrapper[4696]: I0318 16:42:04.268837 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmqxq\" (UniqueName: \"kubernetes.io/projected/d625c9e2-dc61-4c51-ad5d-5d38203c3739-kube-api-access-hmqxq\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:04 crc kubenswrapper[4696]: I0318 16:42:04.597139 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:42:04 crc kubenswrapper[4696]: E0318 16:42:04.597510 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:42:04 crc kubenswrapper[4696]: I0318 16:42:04.725908 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564202-p2zwh" event={"ID":"d625c9e2-dc61-4c51-ad5d-5d38203c3739","Type":"ContainerDied","Data":"8152b312fbd2cee92222ca92b178710d63289915f23c5007a2cd5f23b461a6fc"} Mar 18 16:42:04 crc kubenswrapper[4696]: I0318 16:42:04.725950 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8152b312fbd2cee92222ca92b178710d63289915f23c5007a2cd5f23b461a6fc" Mar 18 16:42:04 crc kubenswrapper[4696]: I0318 16:42:04.725984 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-p2zwh" Mar 18 16:42:05 crc kubenswrapper[4696]: I0318 16:42:05.197718 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7ph7t"] Mar 18 16:42:05 crc kubenswrapper[4696]: I0318 16:42:05.211560 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7ph7t"] Mar 18 16:42:05 crc kubenswrapper[4696]: I0318 16:42:05.610736 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00ab367-0a0b-44c1-bd91-d54036751bc3" path="/var/lib/kubelet/pods/a00ab367-0a0b-44c1-bd91-d54036751bc3/volumes" Mar 18 16:42:16 crc kubenswrapper[4696]: I0318 16:42:16.597380 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:42:16 crc kubenswrapper[4696]: E0318 16:42:16.598247 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:42:25 crc kubenswrapper[4696]: I0318 16:42:25.879689 4696 scope.go:117] "RemoveContainer" 
containerID="46c125917c31de0740cf1a058d88adcbf327d40dc6a7e118d09a377126f31485" Mar 18 16:42:29 crc kubenswrapper[4696]: I0318 16:42:29.597814 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:42:29 crc kubenswrapper[4696]: E0318 16:42:29.598680 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.503728 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nmtf4"] Mar 18 16:42:34 crc kubenswrapper[4696]: E0318 16:42:34.504684 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d625c9e2-dc61-4c51-ad5d-5d38203c3739" containerName="oc" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.504699 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d625c9e2-dc61-4c51-ad5d-5d38203c3739" containerName="oc" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.504888 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d625c9e2-dc61-4c51-ad5d-5d38203c3739" containerName="oc" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.508676 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.522828 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmtf4"] Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.660128 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xtnz\" (UniqueName: \"kubernetes.io/projected/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-kube-api-access-7xtnz\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.660318 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-catalog-content\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.661137 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-utilities\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.763592 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-catalog-content\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.763732 4696 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-utilities\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.763855 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xtnz\" (UniqueName: \"kubernetes.io/projected/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-kube-api-access-7xtnz\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.764218 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-utilities\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.764829 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-catalog-content\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.785002 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xtnz\" (UniqueName: \"kubernetes.io/projected/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-kube-api-access-7xtnz\") pod \"redhat-operators-nmtf4\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:34 crc kubenswrapper[4696]: I0318 16:42:34.831390 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:35 crc kubenswrapper[4696]: I0318 16:42:35.283964 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmtf4"] Mar 18 16:42:36 crc kubenswrapper[4696]: I0318 16:42:36.002205 4696 generic.go:334] "Generic (PLEG): container finished" podID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerID="a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799" exitCode=0 Mar 18 16:42:36 crc kubenswrapper[4696]: I0318 16:42:36.002246 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmtf4" event={"ID":"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e","Type":"ContainerDied","Data":"a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799"} Mar 18 16:42:36 crc kubenswrapper[4696]: I0318 16:42:36.002268 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmtf4" event={"ID":"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e","Type":"ContainerStarted","Data":"e350ce138066292f8661b8a40eb5240742807bb230b71ed6ec40358d1f42e156"} Mar 18 16:42:37 crc kubenswrapper[4696]: I0318 16:42:37.012116 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmtf4" event={"ID":"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e","Type":"ContainerStarted","Data":"e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015"} Mar 18 16:42:41 crc kubenswrapper[4696]: I0318 16:42:41.050025 4696 generic.go:334] "Generic (PLEG): container finished" podID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerID="e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015" exitCode=0 Mar 18 16:42:41 crc kubenswrapper[4696]: I0318 16:42:41.050100 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmtf4" 
event={"ID":"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e","Type":"ContainerDied","Data":"e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015"} Mar 18 16:42:42 crc kubenswrapper[4696]: I0318 16:42:42.047106 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6847f4969-jlnz4" podUID="2418339a-4137-4f64-b098-f0e5011d3f61" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 18 16:42:42 crc kubenswrapper[4696]: I0318 16:42:42.060452 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmtf4" event={"ID":"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e","Type":"ContainerStarted","Data":"81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710"} Mar 18 16:42:42 crc kubenswrapper[4696]: I0318 16:42:42.081308 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nmtf4" podStartSLOduration=2.332000464 podStartE2EDuration="8.081288424s" podCreationTimestamp="2026-03-18 16:42:34 +0000 UTC" firstStartedPulling="2026-03-18 16:42:36.00470163 +0000 UTC m=+3999.010875836" lastFinishedPulling="2026-03-18 16:42:41.75398959 +0000 UTC m=+4004.760163796" observedRunningTime="2026-03-18 16:42:42.076268118 +0000 UTC m=+4005.082442334" watchObservedRunningTime="2026-03-18 16:42:42.081288424 +0000 UTC m=+4005.087462630" Mar 18 16:42:42 crc kubenswrapper[4696]: I0318 16:42:42.597299 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:42:42 crc kubenswrapper[4696]: E0318 16:42:42.597787 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:42:44 crc kubenswrapper[4696]: I0318 16:42:44.832183 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:44 crc kubenswrapper[4696]: I0318 16:42:44.832529 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:45 crc kubenswrapper[4696]: I0318 16:42:45.876503 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nmtf4" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerName="registry-server" probeResult="failure" output=< Mar 18 16:42:45 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Mar 18 16:42:45 crc kubenswrapper[4696]: > Mar 18 16:42:53 crc kubenswrapper[4696]: I0318 16:42:53.600384 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:42:53 crc kubenswrapper[4696]: E0318 16:42:53.601365 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:42:54 crc kubenswrapper[4696]: I0318 16:42:54.881254 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:54 crc kubenswrapper[4696]: I0318 16:42:54.929238 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:55 crc kubenswrapper[4696]: I0318 
16:42:55.116927 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmtf4"] Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.184250 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nmtf4" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerName="registry-server" containerID="cri-o://81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710" gracePeriod=2 Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.705734 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.874462 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xtnz\" (UniqueName: \"kubernetes.io/projected/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-kube-api-access-7xtnz\") pod \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.874895 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-utilities\") pod \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.874993 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-catalog-content\") pod \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\" (UID: \"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e\") " Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.875497 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-utilities" (OuterVolumeSpecName: 
"utilities") pod "380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" (UID: "380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.880829 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-kube-api-access-7xtnz" (OuterVolumeSpecName: "kube-api-access-7xtnz") pod "380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" (UID: "380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e"). InnerVolumeSpecName "kube-api-access-7xtnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.977103 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xtnz\" (UniqueName: \"kubernetes.io/projected/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-kube-api-access-7xtnz\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:56 crc kubenswrapper[4696]: I0318 16:42:56.977161 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.005249 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" (UID: "380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.078530 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.199080 4696 generic.go:334] "Generic (PLEG): container finished" podID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerID="81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710" exitCode=0 Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.199156 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmtf4" event={"ID":"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e","Type":"ContainerDied","Data":"81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710"} Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.199182 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmtf4" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.199207 4696 scope.go:117] "RemoveContainer" containerID="81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.199194 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmtf4" event={"ID":"380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e","Type":"ContainerDied","Data":"e350ce138066292f8661b8a40eb5240742807bb230b71ed6ec40358d1f42e156"} Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.225747 4696 scope.go:117] "RemoveContainer" containerID="e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.241462 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmtf4"] Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.250504 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nmtf4"] Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.275231 4696 scope.go:117] "RemoveContainer" containerID="a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.298439 4696 scope.go:117] "RemoveContainer" containerID="81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710" Mar 18 16:42:57 crc kubenswrapper[4696]: E0318 16:42:57.298982 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710\": container with ID starting with 81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710 not found: ID does not exist" containerID="81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.299018 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710"} err="failed to get container status \"81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710\": rpc error: code = NotFound desc = could not find container \"81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710\": container with ID starting with 81c62246aff94108aaa70c60d0820cd777ade206c92da28dce3624b6e77f5710 not found: ID does not exist" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.299037 4696 scope.go:117] "RemoveContainer" containerID="e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015" Mar 18 16:42:57 crc kubenswrapper[4696]: E0318 16:42:57.299256 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015\": container with ID starting with e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015 not found: ID does not exist" containerID="e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.299275 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015"} err="failed to get container status \"e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015\": rpc error: code = NotFound desc = could not find container \"e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015\": container with ID starting with e223d78f8df7f87eb8ff8f331a08817496d2ec9eee341e3c43b5199f53003015 not found: ID does not exist" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.299286 4696 scope.go:117] "RemoveContainer" containerID="a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799" Mar 18 16:42:57 crc kubenswrapper[4696]: E0318 
16:42:57.299478 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799\": container with ID starting with a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799 not found: ID does not exist" containerID="a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.299494 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799"} err="failed to get container status \"a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799\": rpc error: code = NotFound desc = could not find container \"a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799\": container with ID starting with a6da0a16d21043b43544e9c55b7da4a93d3bb96177d08aad0246dbfd7aec2799 not found: ID does not exist" Mar 18 16:42:57 crc kubenswrapper[4696]: I0318 16:42:57.610748 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" path="/var/lib/kubelet/pods/380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e/volumes" Mar 18 16:43:07 crc kubenswrapper[4696]: I0318 16:43:07.610012 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:43:07 crc kubenswrapper[4696]: E0318 16:43:07.610896 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:43:18 crc kubenswrapper[4696]: I0318 16:43:18.597888 
4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:43:19 crc kubenswrapper[4696]: I0318 16:43:19.385916 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"83f8bb1de200eeb57fe6463ccfbd7d698a465fd68278d22a98885c91af6e914f"} Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.143545 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564204-6mtpb"] Mar 18 16:44:00 crc kubenswrapper[4696]: E0318 16:44:00.144642 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerName="extract-content" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.144660 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerName="extract-content" Mar 18 16:44:00 crc kubenswrapper[4696]: E0318 16:44:00.144693 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerName="extract-utilities" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.144701 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerName="extract-utilities" Mar 18 16:44:00 crc kubenswrapper[4696]: E0318 16:44:00.144736 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerName="registry-server" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.144744 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" containerName="registry-server" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.144960 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="380a9b2a-9a9f-4c5e-abfa-5b3857cdfb6e" 
containerName="registry-server" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.145771 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-6mtpb" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.147947 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.148183 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.149563 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.156688 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-6mtpb"] Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.311865 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q866q\" (UniqueName: \"kubernetes.io/projected/10880ed9-e38d-45c2-8267-37ef99615c30-kube-api-access-q866q\") pod \"auto-csr-approver-29564204-6mtpb\" (UID: \"10880ed9-e38d-45c2-8267-37ef99615c30\") " pod="openshift-infra/auto-csr-approver-29564204-6mtpb" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.413562 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q866q\" (UniqueName: \"kubernetes.io/projected/10880ed9-e38d-45c2-8267-37ef99615c30-kube-api-access-q866q\") pod \"auto-csr-approver-29564204-6mtpb\" (UID: \"10880ed9-e38d-45c2-8267-37ef99615c30\") " pod="openshift-infra/auto-csr-approver-29564204-6mtpb" Mar 18 16:44:00 crc kubenswrapper[4696]: I0318 16:44:00.851366 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q866q\" (UniqueName: 
\"kubernetes.io/projected/10880ed9-e38d-45c2-8267-37ef99615c30-kube-api-access-q866q\") pod \"auto-csr-approver-29564204-6mtpb\" (UID: \"10880ed9-e38d-45c2-8267-37ef99615c30\") " pod="openshift-infra/auto-csr-approver-29564204-6mtpb" Mar 18 16:44:01 crc kubenswrapper[4696]: I0318 16:44:01.071990 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-6mtpb" Mar 18 16:44:01 crc kubenswrapper[4696]: I0318 16:44:01.544055 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-6mtpb"] Mar 18 16:44:01 crc kubenswrapper[4696]: I0318 16:44:01.818053 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-6mtpb" event={"ID":"10880ed9-e38d-45c2-8267-37ef99615c30","Type":"ContainerStarted","Data":"ade5416a036624f9fea7854add435c42f18f83022578dac48a25e1bc9ffc0678"} Mar 18 16:44:03 crc kubenswrapper[4696]: I0318 16:44:03.836006 4696 generic.go:334] "Generic (PLEG): container finished" podID="10880ed9-e38d-45c2-8267-37ef99615c30" containerID="447387eae2a43567fe2682c878ece69baa63c120f2c9628c05c8d6bbc092462d" exitCode=0 Mar 18 16:44:03 crc kubenswrapper[4696]: I0318 16:44:03.836107 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-6mtpb" event={"ID":"10880ed9-e38d-45c2-8267-37ef99615c30","Type":"ContainerDied","Data":"447387eae2a43567fe2682c878ece69baa63c120f2c9628c05c8d6bbc092462d"} Mar 18 16:44:05 crc kubenswrapper[4696]: I0318 16:44:05.133698 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-6mtpb" Mar 18 16:44:05 crc kubenswrapper[4696]: I0318 16:44:05.217205 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q866q\" (UniqueName: \"kubernetes.io/projected/10880ed9-e38d-45c2-8267-37ef99615c30-kube-api-access-q866q\") pod \"10880ed9-e38d-45c2-8267-37ef99615c30\" (UID: \"10880ed9-e38d-45c2-8267-37ef99615c30\") " Mar 18 16:44:05 crc kubenswrapper[4696]: I0318 16:44:05.223229 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10880ed9-e38d-45c2-8267-37ef99615c30-kube-api-access-q866q" (OuterVolumeSpecName: "kube-api-access-q866q") pod "10880ed9-e38d-45c2-8267-37ef99615c30" (UID: "10880ed9-e38d-45c2-8267-37ef99615c30"). InnerVolumeSpecName "kube-api-access-q866q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:44:05 crc kubenswrapper[4696]: I0318 16:44:05.319772 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q866q\" (UniqueName: \"kubernetes.io/projected/10880ed9-e38d-45c2-8267-37ef99615c30-kube-api-access-q866q\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:05 crc kubenswrapper[4696]: I0318 16:44:05.859942 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-6mtpb" event={"ID":"10880ed9-e38d-45c2-8267-37ef99615c30","Type":"ContainerDied","Data":"ade5416a036624f9fea7854add435c42f18f83022578dac48a25e1bc9ffc0678"} Mar 18 16:44:05 crc kubenswrapper[4696]: I0318 16:44:05.860257 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ade5416a036624f9fea7854add435c42f18f83022578dac48a25e1bc9ffc0678" Mar 18 16:44:05 crc kubenswrapper[4696]: I0318 16:44:05.860005 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-6mtpb" Mar 18 16:44:06 crc kubenswrapper[4696]: I0318 16:44:06.191811 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-nglr8"] Mar 18 16:44:06 crc kubenswrapper[4696]: I0318 16:44:06.199188 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-nglr8"] Mar 18 16:44:07 crc kubenswrapper[4696]: I0318 16:44:07.608871 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c1dc242-7619-477c-8bbf-c5b6cc2c7805" path="/var/lib/kubelet/pods/7c1dc242-7619-477c-8bbf-c5b6cc2c7805/volumes" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.376694 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qgh2v"] Mar 18 16:44:15 crc kubenswrapper[4696]: E0318 16:44:15.377797 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10880ed9-e38d-45c2-8267-37ef99615c30" containerName="oc" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.377812 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="10880ed9-e38d-45c2-8267-37ef99615c30" containerName="oc" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.377982 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="10880ed9-e38d-45c2-8267-37ef99615c30" containerName="oc" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.379858 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.399228 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qgh2v"] Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.570713 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-catalog-content\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.571035 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-utilities\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.571104 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tbfj\" (UniqueName: \"kubernetes.io/projected/c57b8a37-6ba6-4952-b4fd-3937690b4376-kube-api-access-9tbfj\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.672754 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tbfj\" (UniqueName: \"kubernetes.io/projected/c57b8a37-6ba6-4952-b4fd-3937690b4376-kube-api-access-9tbfj\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.672921 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-catalog-content\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.672952 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-utilities\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.673475 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-utilities\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.673515 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-catalog-content\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.706059 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tbfj\" (UniqueName: \"kubernetes.io/projected/c57b8a37-6ba6-4952-b4fd-3937690b4376-kube-api-access-9tbfj\") pod \"certified-operators-qgh2v\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:15 crc kubenswrapper[4696]: I0318 16:44:15.744984 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:16 crc kubenswrapper[4696]: I0318 16:44:16.277855 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qgh2v"] Mar 18 16:44:16 crc kubenswrapper[4696]: W0318 16:44:16.282704 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc57b8a37_6ba6_4952_b4fd_3937690b4376.slice/crio-2c67fbc9e0427385b5b684349bc09abc4ab5f89e040da24496d90b3b073c293f WatchSource:0}: Error finding container 2c67fbc9e0427385b5b684349bc09abc4ab5f89e040da24496d90b3b073c293f: Status 404 returned error can't find the container with id 2c67fbc9e0427385b5b684349bc09abc4ab5f89e040da24496d90b3b073c293f Mar 18 16:44:16 crc kubenswrapper[4696]: I0318 16:44:16.963027 4696 generic.go:334] "Generic (PLEG): container finished" podID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerID="22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c" exitCode=0 Mar 18 16:44:16 crc kubenswrapper[4696]: I0318 16:44:16.963119 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgh2v" event={"ID":"c57b8a37-6ba6-4952-b4fd-3937690b4376","Type":"ContainerDied","Data":"22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c"} Mar 18 16:44:16 crc kubenswrapper[4696]: I0318 16:44:16.963355 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgh2v" event={"ID":"c57b8a37-6ba6-4952-b4fd-3937690b4376","Type":"ContainerStarted","Data":"2c67fbc9e0427385b5b684349bc09abc4ab5f89e040da24496d90b3b073c293f"} Mar 18 16:44:18 crc kubenswrapper[4696]: I0318 16:44:18.984857 4696 generic.go:334] "Generic (PLEG): container finished" podID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerID="614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc" exitCode=0 Mar 18 16:44:18 crc kubenswrapper[4696]: I0318 
16:44:18.984921 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgh2v" event={"ID":"c57b8a37-6ba6-4952-b4fd-3937690b4376","Type":"ContainerDied","Data":"614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc"} Mar 18 16:44:19 crc kubenswrapper[4696]: I0318 16:44:19.998659 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgh2v" event={"ID":"c57b8a37-6ba6-4952-b4fd-3937690b4376","Type":"ContainerStarted","Data":"aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88"} Mar 18 16:44:20 crc kubenswrapper[4696]: I0318 16:44:20.026440 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qgh2v" podStartSLOduration=2.627133861 podStartE2EDuration="5.026414575s" podCreationTimestamp="2026-03-18 16:44:15 +0000 UTC" firstStartedPulling="2026-03-18 16:44:16.967344323 +0000 UTC m=+4099.973518529" lastFinishedPulling="2026-03-18 16:44:19.366625037 +0000 UTC m=+4102.372799243" observedRunningTime="2026-03-18 16:44:20.020845315 +0000 UTC m=+4103.027019531" watchObservedRunningTime="2026-03-18 16:44:20.026414575 +0000 UTC m=+4103.032588781" Mar 18 16:44:25 crc kubenswrapper[4696]: I0318 16:44:25.746003 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:25 crc kubenswrapper[4696]: I0318 16:44:25.746511 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:25 crc kubenswrapper[4696]: I0318 16:44:25.787494 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:26 crc kubenswrapper[4696]: I0318 16:44:26.009572 4696 scope.go:117] "RemoveContainer" containerID="d55797574f830c287dc9c3b1d3de8bab805a740b5af4d3557869dfa49c1ffc10" Mar 18 16:44:26 
crc kubenswrapper[4696]: I0318 16:44:26.098095 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:26 crc kubenswrapper[4696]: I0318 16:44:26.165084 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qgh2v"] Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.540456 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2zgrk/must-gather-cm5bk"] Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.542255 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.544671 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2zgrk"/"openshift-service-ca.crt" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.545053 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2zgrk"/"kube-root-ca.crt" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.567407 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zgrk/must-gather-cm5bk"] Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.699915 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7b100a7-0df9-496f-bb44-424109bd8c96-must-gather-output\") pod \"must-gather-cm5bk\" (UID: \"b7b100a7-0df9-496f-bb44-424109bd8c96\") " pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.699983 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrnl\" (UniqueName: \"kubernetes.io/projected/b7b100a7-0df9-496f-bb44-424109bd8c96-kube-api-access-xzrnl\") pod \"must-gather-cm5bk\" (UID: 
\"b7b100a7-0df9-496f-bb44-424109bd8c96\") " pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.802044 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7b100a7-0df9-496f-bb44-424109bd8c96-must-gather-output\") pod \"must-gather-cm5bk\" (UID: \"b7b100a7-0df9-496f-bb44-424109bd8c96\") " pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.802090 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrnl\" (UniqueName: \"kubernetes.io/projected/b7b100a7-0df9-496f-bb44-424109bd8c96-kube-api-access-xzrnl\") pod \"must-gather-cm5bk\" (UID: \"b7b100a7-0df9-496f-bb44-424109bd8c96\") " pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.802876 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7b100a7-0df9-496f-bb44-424109bd8c96-must-gather-output\") pod \"must-gather-cm5bk\" (UID: \"b7b100a7-0df9-496f-bb44-424109bd8c96\") " pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.824991 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzrnl\" (UniqueName: \"kubernetes.io/projected/b7b100a7-0df9-496f-bb44-424109bd8c96-kube-api-access-xzrnl\") pod \"must-gather-cm5bk\" (UID: \"b7b100a7-0df9-496f-bb44-424109bd8c96\") " pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:44:27 crc kubenswrapper[4696]: I0318 16:44:27.861508 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.069296 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qgh2v" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerName="registry-server" containerID="cri-o://aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88" gracePeriod=2 Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.341579 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zgrk/must-gather-cm5bk"] Mar 18 16:44:28 crc kubenswrapper[4696]: W0318 16:44:28.357787 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b100a7_0df9_496f_bb44_424109bd8c96.slice/crio-5f861ffd4245b00c89fe53f18f5ad7e67f2c12fcab69f0c5470ecee5a32bc827 WatchSource:0}: Error finding container 5f861ffd4245b00c89fe53f18f5ad7e67f2c12fcab69f0c5470ecee5a32bc827: Status 404 returned error can't find the container with id 5f861ffd4245b00c89fe53f18f5ad7e67f2c12fcab69f0c5470ecee5a32bc827 Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.466634 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.618200 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-catalog-content\") pod \"c57b8a37-6ba6-4952-b4fd-3937690b4376\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.618255 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tbfj\" (UniqueName: \"kubernetes.io/projected/c57b8a37-6ba6-4952-b4fd-3937690b4376-kube-api-access-9tbfj\") pod \"c57b8a37-6ba6-4952-b4fd-3937690b4376\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.618355 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-utilities\") pod \"c57b8a37-6ba6-4952-b4fd-3937690b4376\" (UID: \"c57b8a37-6ba6-4952-b4fd-3937690b4376\") " Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.619252 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-utilities" (OuterVolumeSpecName: "utilities") pod "c57b8a37-6ba6-4952-b4fd-3937690b4376" (UID: "c57b8a37-6ba6-4952-b4fd-3937690b4376"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.625998 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57b8a37-6ba6-4952-b4fd-3937690b4376-kube-api-access-9tbfj" (OuterVolumeSpecName: "kube-api-access-9tbfj") pod "c57b8a37-6ba6-4952-b4fd-3937690b4376" (UID: "c57b8a37-6ba6-4952-b4fd-3937690b4376"). InnerVolumeSpecName "kube-api-access-9tbfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.680843 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c57b8a37-6ba6-4952-b4fd-3937690b4376" (UID: "c57b8a37-6ba6-4952-b4fd-3937690b4376"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.721098 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.721133 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tbfj\" (UniqueName: \"kubernetes.io/projected/c57b8a37-6ba6-4952-b4fd-3937690b4376-kube-api-access-9tbfj\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:28 crc kubenswrapper[4696]: I0318 16:44:28.721146 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57b8a37-6ba6-4952-b4fd-3937690b4376-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.080242 4696 generic.go:334] "Generic (PLEG): container finished" podID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerID="aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88" exitCode=0 Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.080346 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgh2v" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.080412 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgh2v" event={"ID":"c57b8a37-6ba6-4952-b4fd-3937690b4376","Type":"ContainerDied","Data":"aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88"} Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.082606 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgh2v" event={"ID":"c57b8a37-6ba6-4952-b4fd-3937690b4376","Type":"ContainerDied","Data":"2c67fbc9e0427385b5b684349bc09abc4ab5f89e040da24496d90b3b073c293f"} Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.082639 4696 scope.go:117] "RemoveContainer" containerID="aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.087931 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" event={"ID":"b7b100a7-0df9-496f-bb44-424109bd8c96","Type":"ContainerStarted","Data":"09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb"} Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.087979 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" event={"ID":"b7b100a7-0df9-496f-bb44-424109bd8c96","Type":"ContainerStarted","Data":"1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58"} Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.087994 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" event={"ID":"b7b100a7-0df9-496f-bb44-424109bd8c96","Type":"ContainerStarted","Data":"5f861ffd4245b00c89fe53f18f5ad7e67f2c12fcab69f0c5470ecee5a32bc827"} Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.121642 4696 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" podStartSLOduration=2.121625301 podStartE2EDuration="2.121625301s" podCreationTimestamp="2026-03-18 16:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:29.107832584 +0000 UTC m=+4112.114006800" watchObservedRunningTime="2026-03-18 16:44:29.121625301 +0000 UTC m=+4112.127799507" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.127922 4696 scope.go:117] "RemoveContainer" containerID="614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.135784 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qgh2v"] Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.144156 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qgh2v"] Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.156174 4696 scope.go:117] "RemoveContainer" containerID="22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.201969 4696 scope.go:117] "RemoveContainer" containerID="aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88" Mar 18 16:44:29 crc kubenswrapper[4696]: E0318 16:44:29.202473 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88\": container with ID starting with aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88 not found: ID does not exist" containerID="aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.202540 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88"} err="failed to get container status \"aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88\": rpc error: code = NotFound desc = could not find container \"aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88\": container with ID starting with aa47e08c86b0b9c3a9fce054d37bdc66557d36e8a0e8c73c8465e12291801e88 not found: ID does not exist" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.202572 4696 scope.go:117] "RemoveContainer" containerID="614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc" Mar 18 16:44:29 crc kubenswrapper[4696]: E0318 16:44:29.202967 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc\": container with ID starting with 614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc not found: ID does not exist" containerID="614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.203096 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc"} err="failed to get container status \"614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc\": rpc error: code = NotFound desc = could not find container \"614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc\": container with ID starting with 614e6218be0a5bfe278ee0625013da82e9c46243ea5267091996c0584dc29cfc not found: ID does not exist" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.203201 4696 scope.go:117] "RemoveContainer" containerID="22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c" Mar 18 16:44:29 crc kubenswrapper[4696]: E0318 16:44:29.203596 4696 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c\": container with ID starting with 22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c not found: ID does not exist" containerID="22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.203624 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c"} err="failed to get container status \"22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c\": rpc error: code = NotFound desc = could not find container \"22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c\": container with ID starting with 22d230f12abf023daeb0fe11ccf44bebd45f23a77e028ab61460a1e94bc5ce3c not found: ID does not exist" Mar 18 16:44:29 crc kubenswrapper[4696]: I0318 16:44:29.610608 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" path="/var/lib/kubelet/pods/c57b8a37-6ba6-4952-b4fd-3937690b4376/volumes" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.438444 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ktt9t"] Mar 18 16:44:31 crc kubenswrapper[4696]: E0318 16:44:31.443102 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerName="extract-content" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.443128 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerName="extract-content" Mar 18 16:44:31 crc kubenswrapper[4696]: E0318 16:44:31.443159 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerName="registry-server" Mar 18 16:44:31 crc 
kubenswrapper[4696]: I0318 16:44:31.443166 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerName="registry-server" Mar 18 16:44:31 crc kubenswrapper[4696]: E0318 16:44:31.443184 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerName="extract-utilities" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.443192 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerName="extract-utilities" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.443446 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57b8a37-6ba6-4952-b4fd-3937690b4376" containerName="registry-server" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.445148 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.452811 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktt9t"] Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.606919 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-utilities\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.607758 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvj2h\" (UniqueName: \"kubernetes.io/projected/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-kube-api-access-kvj2h\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc 
kubenswrapper[4696]: I0318 16:44:31.608106 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-catalog-content\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.710748 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-utilities\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.710844 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvj2h\" (UniqueName: \"kubernetes.io/projected/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-kube-api-access-kvj2h\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.710920 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-catalog-content\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.711470 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-catalog-content\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc 
kubenswrapper[4696]: I0318 16:44:31.711779 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-utilities\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.737777 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvj2h\" (UniqueName: \"kubernetes.io/projected/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-kube-api-access-kvj2h\") pod \"community-operators-ktt9t\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:31 crc kubenswrapper[4696]: I0318 16:44:31.780796 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:32 crc kubenswrapper[4696]: I0318 16:44:32.312912 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktt9t"] Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.154307 4696 generic.go:334] "Generic (PLEG): container finished" podID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerID="0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940" exitCode=0 Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.154418 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktt9t" event={"ID":"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1","Type":"ContainerDied","Data":"0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940"} Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.154925 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktt9t" 
event={"ID":"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1","Type":"ContainerStarted","Data":"35e24be5b4cabb5a6fae4b9c56a774009397f71f4e128e46a344a516402e66ff"} Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.454974 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-26zmk"] Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.456179 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.462395 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2zgrk"/"default-dockercfg-nv254" Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.549451 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d999r\" (UniqueName: \"kubernetes.io/projected/b0c65f77-ebda-4336-8916-05a15e2fb2a2-kube-api-access-d999r\") pod \"crc-debug-26zmk\" (UID: \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\") " pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.549577 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0c65f77-ebda-4336-8916-05a15e2fb2a2-host\") pod \"crc-debug-26zmk\" (UID: \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\") " pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.652857 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d999r\" (UniqueName: \"kubernetes.io/projected/b0c65f77-ebda-4336-8916-05a15e2fb2a2-kube-api-access-d999r\") pod \"crc-debug-26zmk\" (UID: \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\") " pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.653148 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0c65f77-ebda-4336-8916-05a15e2fb2a2-host\") pod \"crc-debug-26zmk\" (UID: \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\") " pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.654279 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0c65f77-ebda-4336-8916-05a15e2fb2a2-host\") pod \"crc-debug-26zmk\" (UID: \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\") " pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.688791 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d999r\" (UniqueName: \"kubernetes.io/projected/b0c65f77-ebda-4336-8916-05a15e2fb2a2-kube-api-access-d999r\") pod \"crc-debug-26zmk\" (UID: \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\") " pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:44:33 crc kubenswrapper[4696]: I0318 16:44:33.776704 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:44:34 crc kubenswrapper[4696]: I0318 16:44:34.165079 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/crc-debug-26zmk" event={"ID":"b0c65f77-ebda-4336-8916-05a15e2fb2a2","Type":"ContainerStarted","Data":"50da90607e61a5fee728f97c7be6fa03f2a20c73c494a2ceed6b3e7df2338e54"} Mar 18 16:44:34 crc kubenswrapper[4696]: I0318 16:44:34.165378 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/crc-debug-26zmk" event={"ID":"b0c65f77-ebda-4336-8916-05a15e2fb2a2","Type":"ContainerStarted","Data":"6c29eccf29cceb412fca4d693068725bc859d2fdd6c15ae513e5c437c61fe0b1"} Mar 18 16:44:34 crc kubenswrapper[4696]: I0318 16:44:34.187185 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2zgrk/crc-debug-26zmk" podStartSLOduration=1.187164544 podStartE2EDuration="1.187164544s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:34.178593708 +0000 UTC m=+4117.184767924" watchObservedRunningTime="2026-03-18 16:44:34.187164544 +0000 UTC m=+4117.193338750" Mar 18 16:44:35 crc kubenswrapper[4696]: I0318 16:44:35.175221 4696 generic.go:334] "Generic (PLEG): container finished" podID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerID="4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea" exitCode=0 Mar 18 16:44:35 crc kubenswrapper[4696]: I0318 16:44:35.175308 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktt9t" event={"ID":"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1","Type":"ContainerDied","Data":"4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea"} Mar 18 16:44:36 crc kubenswrapper[4696]: I0318 16:44:36.187025 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ktt9t" event={"ID":"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1","Type":"ContainerStarted","Data":"68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8"} Mar 18 16:44:36 crc kubenswrapper[4696]: I0318 16:44:36.211577 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ktt9t" podStartSLOduration=2.644230655 podStartE2EDuration="5.211503654s" podCreationTimestamp="2026-03-18 16:44:31 +0000 UTC" firstStartedPulling="2026-03-18 16:44:33.156577186 +0000 UTC m=+4116.162751392" lastFinishedPulling="2026-03-18 16:44:35.723850185 +0000 UTC m=+4118.730024391" observedRunningTime="2026-03-18 16:44:36.202819566 +0000 UTC m=+4119.208993772" watchObservedRunningTime="2026-03-18 16:44:36.211503654 +0000 UTC m=+4119.217677860" Mar 18 16:44:41 crc kubenswrapper[4696]: I0318 16:44:41.781126 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:41 crc kubenswrapper[4696]: I0318 16:44:41.782701 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:42 crc kubenswrapper[4696]: I0318 16:44:42.220736 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:42 crc kubenswrapper[4696]: I0318 16:44:42.295761 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:42 crc kubenswrapper[4696]: I0318 16:44:42.468699 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktt9t"] Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.254402 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ktt9t" 
podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerName="registry-server" containerID="cri-o://68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8" gracePeriod=2 Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.709004 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.759458 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvj2h\" (UniqueName: \"kubernetes.io/projected/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-kube-api-access-kvj2h\") pod \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.759642 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-catalog-content\") pod \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.759751 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-utilities\") pod \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\" (UID: \"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1\") " Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.761320 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-utilities" (OuterVolumeSpecName: "utilities") pod "3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" (UID: "3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.771323 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-kube-api-access-kvj2h" (OuterVolumeSpecName: "kube-api-access-kvj2h") pod "3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" (UID: "3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1"). InnerVolumeSpecName "kube-api-access-kvj2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.862137 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvj2h\" (UniqueName: \"kubernetes.io/projected/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-kube-api-access-kvj2h\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:44 crc kubenswrapper[4696]: I0318 16:44:44.862171 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.264019 4696 generic.go:334] "Generic (PLEG): container finished" podID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerID="68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8" exitCode=0 Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.264073 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ktt9t" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.264107 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktt9t" event={"ID":"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1","Type":"ContainerDied","Data":"68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8"} Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.264504 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktt9t" event={"ID":"3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1","Type":"ContainerDied","Data":"35e24be5b4cabb5a6fae4b9c56a774009397f71f4e128e46a344a516402e66ff"} Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.264544 4696 scope.go:117] "RemoveContainer" containerID="68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.281942 4696 scope.go:117] "RemoveContainer" containerID="4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.306217 4696 scope.go:117] "RemoveContainer" containerID="0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.366492 4696 scope.go:117] "RemoveContainer" containerID="68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8" Mar 18 16:44:45 crc kubenswrapper[4696]: E0318 16:44:45.367303 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8\": container with ID starting with 68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8 not found: ID does not exist" containerID="68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.367354 4696 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8"} err="failed to get container status \"68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8\": rpc error: code = NotFound desc = could not find container \"68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8\": container with ID starting with 68d519aafd8b638d0cbee10a850e667cd9267e9c4de8c1f2080144334acd75e8 not found: ID does not exist" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.367393 4696 scope.go:117] "RemoveContainer" containerID="4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea" Mar 18 16:44:45 crc kubenswrapper[4696]: E0318 16:44:45.367808 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea\": container with ID starting with 4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea not found: ID does not exist" containerID="4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.367958 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea"} err="failed to get container status \"4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea\": rpc error: code = NotFound desc = could not find container \"4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea\": container with ID starting with 4a1c3734e42fd84c63c70758c8479527e4ff7842e75ce8e92b383e49b0c76bea not found: ID does not exist" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.368063 4696 scope.go:117] "RemoveContainer" containerID="0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940" Mar 18 16:44:45 crc kubenswrapper[4696]: E0318 16:44:45.368376 4696 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940\": container with ID starting with 0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940 not found: ID does not exist" containerID="0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.368531 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940"} err="failed to get container status \"0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940\": rpc error: code = NotFound desc = could not find container \"0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940\": container with ID starting with 0ab444e695c2e7a1f85768ccb1c39563c784a184522e44713ff995c36a89a940 not found: ID does not exist" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.377363 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" (UID: "3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.475359 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.606960 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktt9t"] Mar 18 16:44:45 crc kubenswrapper[4696]: I0318 16:44:45.607366 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ktt9t"] Mar 18 16:44:47 crc kubenswrapper[4696]: I0318 16:44:47.608825 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" path="/var/lib/kubelet/pods/3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1/volumes" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.155184 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz"] Mar 18 16:45:00 crc kubenswrapper[4696]: E0318 16:45:00.156118 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerName="registry-server" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.156133 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerName="registry-server" Mar 18 16:45:00 crc kubenswrapper[4696]: E0318 16:45:00.156152 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerName="extract-content" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.156158 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerName="extract-content" Mar 18 16:45:00 crc kubenswrapper[4696]: E0318 16:45:00.156172 4696 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerName="extract-utilities" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.156179 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerName="extract-utilities" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.156370 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d43cd2e-fdf5-4df2-ac87-0815ddcbeef1" containerName="registry-server" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.157040 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.160171 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.162106 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.177988 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz"] Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.249269 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62ghp\" (UniqueName: \"kubernetes.io/projected/aedebe4e-9215-42fe-a9ac-4058d047d5c0-kube-api-access-62ghp\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.249583 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/aedebe4e-9215-42fe-a9ac-4058d047d5c0-secret-volume\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.249657 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aedebe4e-9215-42fe-a9ac-4058d047d5c0-config-volume\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.351642 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aedebe4e-9215-42fe-a9ac-4058d047d5c0-config-volume\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.352079 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62ghp\" (UniqueName: \"kubernetes.io/projected/aedebe4e-9215-42fe-a9ac-4058d047d5c0-kube-api-access-62ghp\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.352202 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aedebe4e-9215-42fe-a9ac-4058d047d5c0-secret-volume\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: 
I0318 16:45:00.352728 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aedebe4e-9215-42fe-a9ac-4058d047d5c0-config-volume\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.751378 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aedebe4e-9215-42fe-a9ac-4058d047d5c0-secret-volume\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.761956 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62ghp\" (UniqueName: \"kubernetes.io/projected/aedebe4e-9215-42fe-a9ac-4058d047d5c0-kube-api-access-62ghp\") pod \"collect-profiles-29564205-ns4nz\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:00 crc kubenswrapper[4696]: I0318 16:45:00.778559 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:01 crc kubenswrapper[4696]: I0318 16:45:01.332652 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz"] Mar 18 16:45:01 crc kubenswrapper[4696]: I0318 16:45:01.420730 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" event={"ID":"aedebe4e-9215-42fe-a9ac-4058d047d5c0","Type":"ContainerStarted","Data":"4d1ff25a07740f599542f8a6596c385f1a657eb7e69aff350c207365f0f10879"} Mar 18 16:45:02 crc kubenswrapper[4696]: I0318 16:45:02.429777 4696 generic.go:334] "Generic (PLEG): container finished" podID="aedebe4e-9215-42fe-a9ac-4058d047d5c0" containerID="549dea1a57141a5b653045e4049f837fc6585a1f507c51da98a66833ff046b2d" exitCode=0 Mar 18 16:45:02 crc kubenswrapper[4696]: I0318 16:45:02.429822 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" event={"ID":"aedebe4e-9215-42fe-a9ac-4058d047d5c0","Type":"ContainerDied","Data":"549dea1a57141a5b653045e4049f837fc6585a1f507c51da98a66833ff046b2d"} Mar 18 16:45:03 crc kubenswrapper[4696]: I0318 16:45:03.797008 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:03 crc kubenswrapper[4696]: I0318 16:45:03.929634 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62ghp\" (UniqueName: \"kubernetes.io/projected/aedebe4e-9215-42fe-a9ac-4058d047d5c0-kube-api-access-62ghp\") pod \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " Mar 18 16:45:03 crc kubenswrapper[4696]: I0318 16:45:03.930149 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aedebe4e-9215-42fe-a9ac-4058d047d5c0-secret-volume\") pod \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " Mar 18 16:45:03 crc kubenswrapper[4696]: I0318 16:45:03.930324 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aedebe4e-9215-42fe-a9ac-4058d047d5c0-config-volume\") pod \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\" (UID: \"aedebe4e-9215-42fe-a9ac-4058d047d5c0\") " Mar 18 16:45:03 crc kubenswrapper[4696]: I0318 16:45:03.930890 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedebe4e-9215-42fe-a9ac-4058d047d5c0-config-volume" (OuterVolumeSpecName: "config-volume") pod "aedebe4e-9215-42fe-a9ac-4058d047d5c0" (UID: "aedebe4e-9215-42fe-a9ac-4058d047d5c0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:45:03 crc kubenswrapper[4696]: I0318 16:45:03.931363 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aedebe4e-9215-42fe-a9ac-4058d047d5c0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:03 crc kubenswrapper[4696]: I0318 16:45:03.947727 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aedebe4e-9215-42fe-a9ac-4058d047d5c0-kube-api-access-62ghp" (OuterVolumeSpecName: "kube-api-access-62ghp") pod "aedebe4e-9215-42fe-a9ac-4058d047d5c0" (UID: "aedebe4e-9215-42fe-a9ac-4058d047d5c0"). InnerVolumeSpecName "kube-api-access-62ghp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:45:03 crc kubenswrapper[4696]: I0318 16:45:03.952771 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedebe4e-9215-42fe-a9ac-4058d047d5c0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aedebe4e-9215-42fe-a9ac-4058d047d5c0" (UID: "aedebe4e-9215-42fe-a9ac-4058d047d5c0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:45:04 crc kubenswrapper[4696]: I0318 16:45:04.032875 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aedebe4e-9215-42fe-a9ac-4058d047d5c0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:04 crc kubenswrapper[4696]: I0318 16:45:04.032926 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62ghp\" (UniqueName: \"kubernetes.io/projected/aedebe4e-9215-42fe-a9ac-4058d047d5c0-kube-api-access-62ghp\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:04 crc kubenswrapper[4696]: I0318 16:45:04.449048 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" event={"ID":"aedebe4e-9215-42fe-a9ac-4058d047d5c0","Type":"ContainerDied","Data":"4d1ff25a07740f599542f8a6596c385f1a657eb7e69aff350c207365f0f10879"} Mar 18 16:45:04 crc kubenswrapper[4696]: I0318 16:45:04.449082 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1ff25a07740f599542f8a6596c385f1a657eb7e69aff350c207365f0f10879" Mar 18 16:45:04 crc kubenswrapper[4696]: I0318 16:45:04.449146 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-ns4nz" Mar 18 16:45:04 crc kubenswrapper[4696]: I0318 16:45:04.879912 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw"] Mar 18 16:45:04 crc kubenswrapper[4696]: I0318 16:45:04.888881 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-dj9vw"] Mar 18 16:45:05 crc kubenswrapper[4696]: I0318 16:45:05.608195 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301c9939-0c8a-4b1c-82db-96fbef046ff7" path="/var/lib/kubelet/pods/301c9939-0c8a-4b1c-82db-96fbef046ff7/volumes" Mar 18 16:45:09 crc kubenswrapper[4696]: I0318 16:45:09.489605 4696 generic.go:334] "Generic (PLEG): container finished" podID="b0c65f77-ebda-4336-8916-05a15e2fb2a2" containerID="50da90607e61a5fee728f97c7be6fa03f2a20c73c494a2ceed6b3e7df2338e54" exitCode=0 Mar 18 16:45:09 crc kubenswrapper[4696]: I0318 16:45:09.489656 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/crc-debug-26zmk" event={"ID":"b0c65f77-ebda-4336-8916-05a15e2fb2a2","Type":"ContainerDied","Data":"50da90607e61a5fee728f97c7be6fa03f2a20c73c494a2ceed6b3e7df2338e54"} Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.671231 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.715993 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-26zmk"] Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.751428 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-26zmk"] Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.767066 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0c65f77-ebda-4336-8916-05a15e2fb2a2-host\") pod \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\" (UID: \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\") " Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.767142 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d999r\" (UniqueName: \"kubernetes.io/projected/b0c65f77-ebda-4336-8916-05a15e2fb2a2-kube-api-access-d999r\") pod \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\" (UID: \"b0c65f77-ebda-4336-8916-05a15e2fb2a2\") " Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.768092 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0c65f77-ebda-4336-8916-05a15e2fb2a2-host" (OuterVolumeSpecName: "host") pod "b0c65f77-ebda-4336-8916-05a15e2fb2a2" (UID: "b0c65f77-ebda-4336-8916-05a15e2fb2a2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.785286 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c65f77-ebda-4336-8916-05a15e2fb2a2-kube-api-access-d999r" (OuterVolumeSpecName: "kube-api-access-d999r") pod "b0c65f77-ebda-4336-8916-05a15e2fb2a2" (UID: "b0c65f77-ebda-4336-8916-05a15e2fb2a2"). InnerVolumeSpecName "kube-api-access-d999r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.869348 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0c65f77-ebda-4336-8916-05a15e2fb2a2-host\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:10 crc kubenswrapper[4696]: I0318 16:45:10.869394 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d999r\" (UniqueName: \"kubernetes.io/projected/b0c65f77-ebda-4336-8916-05a15e2fb2a2-kube-api-access-d999r\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.515098 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c29eccf29cceb412fca4d693068725bc859d2fdd6c15ae513e5c437c61fe0b1" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.515164 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-26zmk" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.608607 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c65f77-ebda-4336-8916-05a15e2fb2a2" path="/var/lib/kubelet/pods/b0c65f77-ebda-4336-8916-05a15e2fb2a2/volumes" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.930409 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-fxxbp"] Mar 18 16:45:11 crc kubenswrapper[4696]: E0318 16:45:11.931807 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c65f77-ebda-4336-8916-05a15e2fb2a2" containerName="container-00" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.931877 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c65f77-ebda-4336-8916-05a15e2fb2a2" containerName="container-00" Mar 18 16:45:11 crc kubenswrapper[4696]: E0318 16:45:11.931938 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedebe4e-9215-42fe-a9ac-4058d047d5c0" 
containerName="collect-profiles" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.931989 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedebe4e-9215-42fe-a9ac-4058d047d5c0" containerName="collect-profiles" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.932254 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c65f77-ebda-4336-8916-05a15e2fb2a2" containerName="container-00" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.932330 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="aedebe4e-9215-42fe-a9ac-4058d047d5c0" containerName="collect-profiles" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.932986 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:11 crc kubenswrapper[4696]: I0318 16:45:11.935731 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2zgrk"/"default-dockercfg-nv254" Mar 18 16:45:12 crc kubenswrapper[4696]: I0318 16:45:12.097850 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13740220-e7fc-4157-a356-b80398093860-host\") pod \"crc-debug-fxxbp\" (UID: \"13740220-e7fc-4157-a356-b80398093860\") " pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:12 crc kubenswrapper[4696]: I0318 16:45:12.098115 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkvzs\" (UniqueName: \"kubernetes.io/projected/13740220-e7fc-4157-a356-b80398093860-kube-api-access-pkvzs\") pod \"crc-debug-fxxbp\" (UID: \"13740220-e7fc-4157-a356-b80398093860\") " pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:12 crc kubenswrapper[4696]: I0318 16:45:12.200422 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/13740220-e7fc-4157-a356-b80398093860-host\") pod \"crc-debug-fxxbp\" (UID: \"13740220-e7fc-4157-a356-b80398093860\") " pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:12 crc kubenswrapper[4696]: I0318 16:45:12.200661 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkvzs\" (UniqueName: \"kubernetes.io/projected/13740220-e7fc-4157-a356-b80398093860-kube-api-access-pkvzs\") pod \"crc-debug-fxxbp\" (UID: \"13740220-e7fc-4157-a356-b80398093860\") " pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:12 crc kubenswrapper[4696]: I0318 16:45:12.200773 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13740220-e7fc-4157-a356-b80398093860-host\") pod \"crc-debug-fxxbp\" (UID: \"13740220-e7fc-4157-a356-b80398093860\") " pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:12 crc kubenswrapper[4696]: I0318 16:45:12.219921 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkvzs\" (UniqueName: \"kubernetes.io/projected/13740220-e7fc-4157-a356-b80398093860-kube-api-access-pkvzs\") pod \"crc-debug-fxxbp\" (UID: \"13740220-e7fc-4157-a356-b80398093860\") " pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:12 crc kubenswrapper[4696]: I0318 16:45:12.248954 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:12 crc kubenswrapper[4696]: I0318 16:45:12.525808 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" event={"ID":"13740220-e7fc-4157-a356-b80398093860","Type":"ContainerStarted","Data":"5b25f44701ae4ab3b9aa4d96ffc20eba019191b46f843d42d8218373a6574a0e"} Mar 18 16:45:13 crc kubenswrapper[4696]: I0318 16:45:13.535106 4696 generic.go:334] "Generic (PLEG): container finished" podID="13740220-e7fc-4157-a356-b80398093860" containerID="80b085e568e52d78e894b0950373c1ee953eecc1005f7171d857e3099c80ddcf" exitCode=0 Mar 18 16:45:13 crc kubenswrapper[4696]: I0318 16:45:13.535212 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" event={"ID":"13740220-e7fc-4157-a356-b80398093860","Type":"ContainerDied","Data":"80b085e568e52d78e894b0950373c1ee953eecc1005f7171d857e3099c80ddcf"} Mar 18 16:45:13 crc kubenswrapper[4696]: I0318 16:45:13.914452 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-fxxbp"] Mar 18 16:45:13 crc kubenswrapper[4696]: I0318 16:45:13.925029 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-fxxbp"] Mar 18 16:45:14 crc kubenswrapper[4696]: I0318 16:45:14.638099 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:14 crc kubenswrapper[4696]: I0318 16:45:14.742183 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13740220-e7fc-4157-a356-b80398093860-host\") pod \"13740220-e7fc-4157-a356-b80398093860\" (UID: \"13740220-e7fc-4157-a356-b80398093860\") " Mar 18 16:45:14 crc kubenswrapper[4696]: I0318 16:45:14.742279 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkvzs\" (UniqueName: \"kubernetes.io/projected/13740220-e7fc-4157-a356-b80398093860-kube-api-access-pkvzs\") pod \"13740220-e7fc-4157-a356-b80398093860\" (UID: \"13740220-e7fc-4157-a356-b80398093860\") " Mar 18 16:45:14 crc kubenswrapper[4696]: I0318 16:45:14.742296 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13740220-e7fc-4157-a356-b80398093860-host" (OuterVolumeSpecName: "host") pod "13740220-e7fc-4157-a356-b80398093860" (UID: "13740220-e7fc-4157-a356-b80398093860"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:45:14 crc kubenswrapper[4696]: I0318 16:45:14.743040 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13740220-e7fc-4157-a356-b80398093860-host\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:14 crc kubenswrapper[4696]: I0318 16:45:14.747140 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13740220-e7fc-4157-a356-b80398093860-kube-api-access-pkvzs" (OuterVolumeSpecName: "kube-api-access-pkvzs") pod "13740220-e7fc-4157-a356-b80398093860" (UID: "13740220-e7fc-4157-a356-b80398093860"). InnerVolumeSpecName "kube-api-access-pkvzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:45:14 crc kubenswrapper[4696]: I0318 16:45:14.844841 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkvzs\" (UniqueName: \"kubernetes.io/projected/13740220-e7fc-4157-a356-b80398093860-kube-api-access-pkvzs\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.088194 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-qf688"] Mar 18 16:45:15 crc kubenswrapper[4696]: E0318 16:45:15.088571 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13740220-e7fc-4157-a356-b80398093860" containerName="container-00" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.088588 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="13740220-e7fc-4157-a356-b80398093860" containerName="container-00" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.088798 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="13740220-e7fc-4157-a356-b80398093860" containerName="container-00" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.089392 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.252442 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzsp\" (UniqueName: \"kubernetes.io/projected/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-kube-api-access-2kzsp\") pod \"crc-debug-qf688\" (UID: \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\") " pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.252852 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-host\") pod \"crc-debug-qf688\" (UID: \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\") " pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.355262 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzsp\" (UniqueName: \"kubernetes.io/projected/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-kube-api-access-2kzsp\") pod \"crc-debug-qf688\" (UID: \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\") " pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.355447 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-host\") pod \"crc-debug-qf688\" (UID: \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\") " pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.355576 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-host\") pod \"crc-debug-qf688\" (UID: \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\") " pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:15 crc 
kubenswrapper[4696]: I0318 16:45:15.376048 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzsp\" (UniqueName: \"kubernetes.io/projected/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-kube-api-access-2kzsp\") pod \"crc-debug-qf688\" (UID: \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\") " pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.407513 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:15 crc kubenswrapper[4696]: W0318 16:45:15.439588 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebf1313e_370f_49f1_a4d6_be6d9c4eed52.slice/crio-f04c4bedf85687a0ac1bc21a49bbe8181370131b3dc3663f9edd6d442da8fb1f WatchSource:0}: Error finding container f04c4bedf85687a0ac1bc21a49bbe8181370131b3dc3663f9edd6d442da8fb1f: Status 404 returned error can't find the container with id f04c4bedf85687a0ac1bc21a49bbe8181370131b3dc3663f9edd6d442da8fb1f Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.556808 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-fxxbp" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.556832 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b25f44701ae4ab3b9aa4d96ffc20eba019191b46f843d42d8218373a6574a0e" Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.559305 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/crc-debug-qf688" event={"ID":"ebf1313e-370f-49f1-a4d6-be6d9c4eed52","Type":"ContainerStarted","Data":"f04c4bedf85687a0ac1bc21a49bbe8181370131b3dc3663f9edd6d442da8fb1f"} Mar 18 16:45:15 crc kubenswrapper[4696]: I0318 16:45:15.609389 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13740220-e7fc-4157-a356-b80398093860" path="/var/lib/kubelet/pods/13740220-e7fc-4157-a356-b80398093860/volumes" Mar 18 16:45:16 crc kubenswrapper[4696]: I0318 16:45:16.569916 4696 generic.go:334] "Generic (PLEG): container finished" podID="ebf1313e-370f-49f1-a4d6-be6d9c4eed52" containerID="5f78259fc14fece064be8710ca508f2d63c832ac63b6c3341d461caf6384d711" exitCode=0 Mar 18 16:45:16 crc kubenswrapper[4696]: I0318 16:45:16.570016 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/crc-debug-qf688" event={"ID":"ebf1313e-370f-49f1-a4d6-be6d9c4eed52","Type":"ContainerDied","Data":"5f78259fc14fece064be8710ca508f2d63c832ac63b6c3341d461caf6384d711"} Mar 18 16:45:16 crc kubenswrapper[4696]: I0318 16:45:16.610046 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-qf688"] Mar 18 16:45:16 crc kubenswrapper[4696]: I0318 16:45:16.620323 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2zgrk/crc-debug-qf688"] Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.295871 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.407947 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kzsp\" (UniqueName: \"kubernetes.io/projected/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-kube-api-access-2kzsp\") pod \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\" (UID: \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\") " Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.408147 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-host\") pod \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\" (UID: \"ebf1313e-370f-49f1-a4d6-be6d9c4eed52\") " Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.408307 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-host" (OuterVolumeSpecName: "host") pod "ebf1313e-370f-49f1-a4d6-be6d9c4eed52" (UID: "ebf1313e-370f-49f1-a4d6-be6d9c4eed52"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.408807 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-host\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.413183 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-kube-api-access-2kzsp" (OuterVolumeSpecName: "kube-api-access-2kzsp") pod "ebf1313e-370f-49f1-a4d6-be6d9c4eed52" (UID: "ebf1313e-370f-49f1-a4d6-be6d9c4eed52"). InnerVolumeSpecName "kube-api-access-2kzsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.510876 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kzsp\" (UniqueName: \"kubernetes.io/projected/ebf1313e-370f-49f1-a4d6-be6d9c4eed52-kube-api-access-2kzsp\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.606787 4696 scope.go:117] "RemoveContainer" containerID="5f78259fc14fece064be8710ca508f2d63c832ac63b6c3341d461caf6384d711" Mar 18 16:45:18 crc kubenswrapper[4696]: I0318 16:45:18.607108 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zgrk/crc-debug-qf688" Mar 18 16:45:19 crc kubenswrapper[4696]: I0318 16:45:19.607991 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf1313e-370f-49f1-a4d6-be6d9c4eed52" path="/var/lib/kubelet/pods/ebf1313e-370f-49f1-a4d6-be6d9c4eed52/volumes" Mar 18 16:45:26 crc kubenswrapper[4696]: I0318 16:45:26.093914 4696 scope.go:117] "RemoveContainer" containerID="ea00a712d0a04b47e3fd09162a7983c453750d0a13341d29532e32fcfbeb7d80" Mar 18 16:45:42 crc kubenswrapper[4696]: I0318 16:45:42.185020 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:45:42 crc kubenswrapper[4696]: I0318 16:45:42.185472 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:45:48 crc kubenswrapper[4696]: I0318 16:45:48.880135 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-756f5c8c54-wjfvb_0d221056-d9f9-47b1-9871-65a83cd55cb4/barbican-api/0.log" Mar 18 16:45:49 crc kubenswrapper[4696]: I0318 16:45:49.044554 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-756f5c8c54-wjfvb_0d221056-d9f9-47b1-9871-65a83cd55cb4/barbican-api-log/0.log" Mar 18 16:45:49 crc kubenswrapper[4696]: I0318 16:45:49.081829 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-745b9b4c58-ztgcm_07475c5d-ee2a-407e-986d-245ada3da65c/barbican-keystone-listener/0.log" Mar 18 16:45:49 crc kubenswrapper[4696]: I0318 16:45:49.134822 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-745b9b4c58-ztgcm_07475c5d-ee2a-407e-986d-245ada3da65c/barbican-keystone-listener-log/0.log" Mar 18 16:45:49 crc kubenswrapper[4696]: I0318 16:45:49.287025 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d98fdb45f-pcbp7_48631acd-5b2b-48d2-9386-6e023de39655/barbican-worker/0.log" Mar 18 16:45:49 crc kubenswrapper[4696]: I0318 16:45:49.292809 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d98fdb45f-pcbp7_48631acd-5b2b-48d2-9386-6e023de39655/barbican-worker-log/0.log" Mar 18 16:45:49 crc kubenswrapper[4696]: I0318 16:45:49.888352 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3/ceilometer-notification-agent/0.log" Mar 18 16:45:49 crc kubenswrapper[4696]: I0318 16:45:49.910973 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3/ceilometer-central-agent/0.log" Mar 18 16:45:49 crc kubenswrapper[4696]: I0318 16:45:49.995551 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rfglh_1f8feb1b-5d39-4cb7-996f-dc5e34065193/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.060931 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3/proxy-httpd/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.102365 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6b0d7ad6-f3f8-4b63-9993-9c4c61ac65a3/sg-core/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.233916 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce02fef3-40fa-46fe-a496-0aada019e24b/cinder-api/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.255790 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce02fef3-40fa-46fe-a496-0aada019e24b/cinder-api-log/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.407741 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e2ee43b8-090b-4daf-907b-9a21c3986e42/cinder-scheduler/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.442795 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e2ee43b8-090b-4daf-907b-9a21c3986e42/probe/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.659203 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7hmtb_f2c235ca-a193-47df-8495-600e7c8eea37/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.797428 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mz7m7_3ffbd0db-84b3-4593-a9f8-7f61bf72fdc6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" 
Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.872977 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-kff5g_b04fc1e7-0f41-46df-90ac-71d0b7d4e29d/init/0.log" Mar 18 16:45:50 crc kubenswrapper[4696]: I0318 16:45:50.970713 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-kff5g_b04fc1e7-0f41-46df-90ac-71d0b7d4e29d/init/0.log" Mar 18 16:45:51 crc kubenswrapper[4696]: I0318 16:45:51.035362 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-kff5g_b04fc1e7-0f41-46df-90ac-71d0b7d4e29d/dnsmasq-dns/0.log" Mar 18 16:45:51 crc kubenswrapper[4696]: I0318 16:45:51.516086 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9351c230-91b7-40c0-afbc-8adad7604ad4/glance-httpd/0.log" Mar 18 16:45:51 crc kubenswrapper[4696]: I0318 16:45:51.522306 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wbgp4_c9ae5aa3-8f8f-4951-85ec-1b3583c90481/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:51 crc kubenswrapper[4696]: I0318 16:45:51.630199 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9351c230-91b7-40c0-afbc-8adad7604ad4/glance-log/0.log" Mar 18 16:45:51 crc kubenswrapper[4696]: I0318 16:45:51.707312 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1f1e526e-e856-452e-8fc6-26663ca20e4a/glance-log/0.log" Mar 18 16:45:51 crc kubenswrapper[4696]: I0318 16:45:51.707503 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1f1e526e-e856-452e-8fc6-26663ca20e4a/glance-httpd/0.log" Mar 18 16:45:52 crc kubenswrapper[4696]: I0318 16:45:52.014835 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-59764c649b-dxxpb_abd090d6-037c-4cc7-907a-43293ce636ff/horizon/0.log" Mar 18 16:45:52 crc kubenswrapper[4696]: I0318 16:45:52.135240 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r27gg_f5a70cb2-3b7d-43ab-9ab6-c154a737db7d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:52 crc kubenswrapper[4696]: I0318 16:45:52.321281 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-59764c649b-dxxpb_abd090d6-037c-4cc7-907a-43293ce636ff/horizon-log/0.log" Mar 18 16:45:52 crc kubenswrapper[4696]: I0318 16:45:52.590920 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-8qtfd_aa8fa732-917d-4782-aa47-b1846179b603/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:52 crc kubenswrapper[4696]: I0318 16:45:52.636328 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b5955bfd6-zmfrz_811e96fe-c7fe-424f-b86f-043aaa273d62/keystone-api/0.log" Mar 18 16:45:52 crc kubenswrapper[4696]: I0318 16:45:52.641082 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29564161-mrc5s_e8aacc65-eb04-4cb3-8ab2-fb34b6769db4/keystone-cron/0.log" Mar 18 16:45:52 crc kubenswrapper[4696]: I0318 16:45:52.757979 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6b790582-ecd0-41b7-8f9c-f0ef9d2415db/kube-state-metrics/0.log" Mar 18 16:45:53 crc kubenswrapper[4696]: I0318 16:45:53.162691 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667fb94989-br52g_13f1595d-6eb1-41a2-8cd9-12d80a38303f/neutron-api/0.log" Mar 18 16:45:53 crc kubenswrapper[4696]: I0318 16:45:53.285811 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667fb94989-br52g_13f1595d-6eb1-41a2-8cd9-12d80a38303f/neutron-httpd/0.log" Mar 
18 16:45:53 crc kubenswrapper[4696]: I0318 16:45:53.351968 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-r8b8p_ded21247-5107-45ab-9b12-25cb76cdfda3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:53 crc kubenswrapper[4696]: I0318 16:45:53.461928 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-z8bp7_a50d071c-8a54-4335-be8c-1842e52dcb81/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:53 crc kubenswrapper[4696]: I0318 16:45:53.914733 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f6247665-ab0d-4101-acfd-c3da0f598788/nova-cell0-conductor-conductor/0.log" Mar 18 16:45:53 crc kubenswrapper[4696]: I0318 16:45:53.964980 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_449c7dea-20e2-4b99-bec6-e3287082418a/nova-api-log/0.log" Mar 18 16:45:54 crc kubenswrapper[4696]: I0318 16:45:54.429609 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_21cc776b-31bc-469a-9b50-930b0480541d/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 16:45:54 crc kubenswrapper[4696]: I0318 16:45:54.431733 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_449c7dea-20e2-4b99-bec6-e3287082418a/nova-api-api/0.log" Mar 18 16:45:54 crc kubenswrapper[4696]: I0318 16:45:54.456538 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_69cc8eab-a88e-49ce-830b-9e352aea0d5f/nova-cell1-conductor-conductor/0.log" Mar 18 16:45:54 crc kubenswrapper[4696]: I0318 16:45:54.797457 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c03e61df-341f-42de-8682-c17255ffedcb/nova-metadata-log/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.237273 4696 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-metadata-0_c03e61df-341f-42de-8682-c17255ffedcb/nova-metadata-metadata/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.257838 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f49d5ebf-6c40-4bb4-bc0a-0ce72839c86a/nova-scheduler-scheduler/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.277619 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-r4dhh_57f3ea1b-d23e-435c-826f-539c401753be/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.383107 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_87f019c6-a59d-4465-8fb8-c47b198c513b/mysql-bootstrap/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.534454 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_87f019c6-a59d-4465-8fb8-c47b198c513b/galera/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.586269 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dd1c1de5-fda6-4306-bde0-736fd76a8f31/mysql-bootstrap/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.598415 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_87f019c6-a59d-4465-8fb8-c47b198c513b/mysql-bootstrap/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.844292 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dd1c1de5-fda6-4306-bde0-736fd76a8f31/galera/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.878599 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dd1c1de5-fda6-4306-bde0-736fd76a8f31/mysql-bootstrap/0.log" Mar 18 16:45:55 crc kubenswrapper[4696]: I0318 16:45:55.909126 4696 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstackclient_d4b8c82e-7e5c-4db7-8503-fe8d64b60d2e/openstackclient/0.log" Mar 18 16:45:56 crc kubenswrapper[4696]: I0318 16:45:56.078748 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hb4gt_0b77b78e-7226-4d19-a9b7-190ad5248eb7/openstack-network-exporter/0.log" Mar 18 16:45:56 crc kubenswrapper[4696]: I0318 16:45:56.291776 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x4bkz_a37c7e7f-336d-4e95-b9ea-3750b49d4117/ovsdb-server-init/0.log" Mar 18 16:45:56 crc kubenswrapper[4696]: I0318 16:45:56.464553 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x4bkz_a37c7e7f-336d-4e95-b9ea-3750b49d4117/ovsdb-server-init/0.log" Mar 18 16:45:56 crc kubenswrapper[4696]: I0318 16:45:56.490980 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x4bkz_a37c7e7f-336d-4e95-b9ea-3750b49d4117/ovsdb-server/0.log" Mar 18 16:45:56 crc kubenswrapper[4696]: I0318 16:45:56.515556 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x4bkz_a37c7e7f-336d-4e95-b9ea-3750b49d4117/ovs-vswitchd/0.log" Mar 18 16:45:56 crc kubenswrapper[4696]: I0318 16:45:56.691167 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vb7xn_efa7f696-eda9-4cd4-953b-0a24e9935290/ovn-controller/0.log" Mar 18 16:45:56 crc kubenswrapper[4696]: I0318 16:45:56.839767 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-gsd7h_8ac2ae34-5ffd-4557-96ae-c4d268e2cf73/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:56 crc kubenswrapper[4696]: I0318 16:45:56.906328 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6d256733-b9f7-484d-873a-b77e062f63c8/openstack-network-exporter/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 
16:45:57.005226 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6d256733-b9f7-484d-873a-b77e062f63c8/ovn-northd/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.061423 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_50019b99-f0df-4582-ab2a-49f761bc0aa7/openstack-network-exporter/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.140150 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_50019b99-f0df-4582-ab2a-49f761bc0aa7/ovsdbserver-nb/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.255631 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8df5e2e0-02fe-4be7-ae7d-f92ea79ce510/ovsdbserver-sb/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.330771 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8df5e2e0-02fe-4be7-ae7d-f92ea79ce510/openstack-network-exporter/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.516608 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9749b5588-6wsv8_1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f/placement-api/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.589312 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9749b5588-6wsv8_1e8286ea-ad27-496c-bcc2-d0cf5cd5e39f/placement-log/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.613877 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ad207e86-aeb6-4af2-a411-dee8342b4fe9/setup-container/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.828394 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ad207e86-aeb6-4af2-a411-dee8342b4fe9/setup-container/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.916121 4696 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db68e71-4312-400b-8575-06f87bf6a781/setup-container/0.log" Mar 18 16:45:57 crc kubenswrapper[4696]: I0318 16:45:57.962600 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ad207e86-aeb6-4af2-a411-dee8342b4fe9/rabbitmq/0.log" Mar 18 16:45:58 crc kubenswrapper[4696]: I0318 16:45:58.124318 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db68e71-4312-400b-8575-06f87bf6a781/setup-container/0.log" Mar 18 16:45:58 crc kubenswrapper[4696]: I0318 16:45:58.138188 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db68e71-4312-400b-8575-06f87bf6a781/rabbitmq/0.log" Mar 18 16:45:58 crc kubenswrapper[4696]: I0318 16:45:58.172305 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-vf2bg_43fbb202-ffe4-40ba-b61e-ea284e533c1f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:58 crc kubenswrapper[4696]: I0318 16:45:58.361685 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-vxq5r_ae98a130-1216-4906-8e7b-3721a2857935/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:58 crc kubenswrapper[4696]: I0318 16:45:58.500914 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-pwcrs_0489a724-0e24-4090-afc8-8d7baec47630/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.079913 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rshr7_174da10e-47cb-4e7a-8226-e7a4baeaf2ac/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.116077 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v47nc_3460c0d4-77fe-49fd-a525-52b831bf4ff6/ssh-known-hosts-edpm-deployment/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.392313 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6847f4969-jlnz4_2418339a-4137-4f64-b098-f0e5011d3f61/proxy-server/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.501589 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6847f4969-jlnz4_2418339a-4137-4f64-b098-f0e5011d3f61/proxy-httpd/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.503299 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xj2ch_62b4b14f-0ab3-4906-9c97-8c3092cd5379/swift-ring-rebalance/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.613396 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/account-auditor/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.703302 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/account-reaper/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.776608 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/account-replicator/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.832377 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/account-server/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.893248 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/container-auditor/0.log" Mar 18 16:45:59 crc kubenswrapper[4696]: I0318 16:45:59.939760 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/container-replicator/0.log" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.015209 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/container-server/0.log" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.076896 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/container-updater/0.log" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.129672 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-auditor/0.log" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.141598 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564206-27ttw"] Mar 18 16:46:00 crc kubenswrapper[4696]: E0318 16:46:00.142130 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf1313e-370f-49f1-a4d6-be6d9c4eed52" containerName="container-00" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.142152 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf1313e-370f-49f1-a4d6-be6d9c4eed52" containerName="container-00" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.142392 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf1313e-370f-49f1-a4d6-be6d9c4eed52" containerName="container-00" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.143217 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-27ttw" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.149982 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.150151 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.149983 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.150355 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-27ttw"] Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.228414 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5htz2\" (UniqueName: \"kubernetes.io/projected/9ffbaf28-f13f-41a5-b4f5-02312fb9cecd-kube-api-access-5htz2\") pod \"auto-csr-approver-29564206-27ttw\" (UID: \"9ffbaf28-f13f-41a5-b4f5-02312fb9cecd\") " pod="openshift-infra/auto-csr-approver-29564206-27ttw" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.329729 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5htz2\" (UniqueName: \"kubernetes.io/projected/9ffbaf28-f13f-41a5-b4f5-02312fb9cecd-kube-api-access-5htz2\") pod \"auto-csr-approver-29564206-27ttw\" (UID: \"9ffbaf28-f13f-41a5-b4f5-02312fb9cecd\") " pod="openshift-infra/auto-csr-approver-29564206-27ttw" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.449976 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5htz2\" (UniqueName: \"kubernetes.io/projected/9ffbaf28-f13f-41a5-b4f5-02312fb9cecd-kube-api-access-5htz2\") pod \"auto-csr-approver-29564206-27ttw\" (UID: \"9ffbaf28-f13f-41a5-b4f5-02312fb9cecd\") " 
pod="openshift-infra/auto-csr-approver-29564206-27ttw" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.465907 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-27ttw" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.843316 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-updater/0.log" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.867845 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-server/0.log" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.874609 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-expirer/0.log" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.911165 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/object-replicator/0.log" Mar 18 16:46:00 crc kubenswrapper[4696]: I0318 16:46:00.938526 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-27ttw"] Mar 18 16:46:01 crc kubenswrapper[4696]: I0318 16:46:01.074374 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/rsync/0.log" Mar 18 16:46:01 crc kubenswrapper[4696]: I0318 16:46:01.091269 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_acfc5351-8c75-4362-8e66-b9ade04d74eb/swift-recon-cron/0.log" Mar 18 16:46:01 crc kubenswrapper[4696]: I0318 16:46:01.366065 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0b6d2f26-746f-404e-817e-ca3b65cc9511/tempest-tests-tempest-tests-runner/0.log" Mar 18 16:46:01 crc kubenswrapper[4696]: I0318 16:46:01.366479 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-27ttw" event={"ID":"9ffbaf28-f13f-41a5-b4f5-02312fb9cecd","Type":"ContainerStarted","Data":"979bcd9f85b097ccba9965fbc8e7e0130b603a98da71402d2c5dca4d371dec7e"} Mar 18 16:46:01 crc kubenswrapper[4696]: I0318 16:46:01.566931 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-w4vdc_9c5cf28b-0e58-48d1-bd91-2a403201c425/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:46:01 crc kubenswrapper[4696]: I0318 16:46:01.602552 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3166f000-157f-4751-a28c-a2e5f5caa4d9/test-operator-logs-container/0.log" Mar 18 16:46:01 crc kubenswrapper[4696]: I0318 16:46:01.745472 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zxchz_ffcf2496-8e16-4355-863a-7cad2e2357fe/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 16:46:03 crc kubenswrapper[4696]: I0318 16:46:03.395641 4696 generic.go:334] "Generic (PLEG): container finished" podID="9ffbaf28-f13f-41a5-b4f5-02312fb9cecd" containerID="049a8c71a695690db0006119087c047dba792e9bb0c17029c7fdafffc315b467" exitCode=0 Mar 18 16:46:03 crc kubenswrapper[4696]: I0318 16:46:03.395954 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-27ttw" event={"ID":"9ffbaf28-f13f-41a5-b4f5-02312fb9cecd","Type":"ContainerDied","Data":"049a8c71a695690db0006119087c047dba792e9bb0c17029c7fdafffc315b467"} Mar 18 16:46:04 crc kubenswrapper[4696]: I0318 16:46:04.869141 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-27ttw" Mar 18 16:46:05 crc kubenswrapper[4696]: I0318 16:46:05.012445 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5htz2\" (UniqueName: \"kubernetes.io/projected/9ffbaf28-f13f-41a5-b4f5-02312fb9cecd-kube-api-access-5htz2\") pod \"9ffbaf28-f13f-41a5-b4f5-02312fb9cecd\" (UID: \"9ffbaf28-f13f-41a5-b4f5-02312fb9cecd\") " Mar 18 16:46:05 crc kubenswrapper[4696]: I0318 16:46:05.019101 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffbaf28-f13f-41a5-b4f5-02312fb9cecd-kube-api-access-5htz2" (OuterVolumeSpecName: "kube-api-access-5htz2") pod "9ffbaf28-f13f-41a5-b4f5-02312fb9cecd" (UID: "9ffbaf28-f13f-41a5-b4f5-02312fb9cecd"). InnerVolumeSpecName "kube-api-access-5htz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:46:05 crc kubenswrapper[4696]: I0318 16:46:05.115187 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5htz2\" (UniqueName: \"kubernetes.io/projected/9ffbaf28-f13f-41a5-b4f5-02312fb9cecd-kube-api-access-5htz2\") on node \"crc\" DevicePath \"\"" Mar 18 16:46:05 crc kubenswrapper[4696]: I0318 16:46:05.414329 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-27ttw" event={"ID":"9ffbaf28-f13f-41a5-b4f5-02312fb9cecd","Type":"ContainerDied","Data":"979bcd9f85b097ccba9965fbc8e7e0130b603a98da71402d2c5dca4d371dec7e"} Mar 18 16:46:05 crc kubenswrapper[4696]: I0318 16:46:05.414744 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="979bcd9f85b097ccba9965fbc8e7e0130b603a98da71402d2c5dca4d371dec7e" Mar 18 16:46:05 crc kubenswrapper[4696]: I0318 16:46:05.414365 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-27ttw" Mar 18 16:46:05 crc kubenswrapper[4696]: I0318 16:46:05.940703 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-bdfr4"] Mar 18 16:46:05 crc kubenswrapper[4696]: I0318 16:46:05.951362 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-bdfr4"] Mar 18 16:46:07 crc kubenswrapper[4696]: I0318 16:46:07.613594 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cfc7b5-4df3-493e-922e-d4a75ba9798b" path="/var/lib/kubelet/pods/87cfc7b5-4df3-493e-922e-d4a75ba9798b/volumes" Mar 18 16:46:11 crc kubenswrapper[4696]: I0318 16:46:11.129713 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_88bcbc43-a512-4f0f-8ce6-e6fd9905df8b/memcached/0.log" Mar 18 16:46:12 crc kubenswrapper[4696]: I0318 16:46:12.184173 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:46:12 crc kubenswrapper[4696]: I0318 16:46:12.184571 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:46:26 crc kubenswrapper[4696]: I0318 16:46:26.198348 4696 scope.go:117] "RemoveContainer" containerID="50f946607503ec571a0fc009a4fe5c8e330237312d9f5dafdc6cfedcf2d64ea6" Mar 18 16:46:27 crc kubenswrapper[4696]: I0318 16:46:27.419115 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/util/0.log" Mar 18 16:46:27 crc kubenswrapper[4696]: I0318 16:46:27.584666 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/util/0.log" Mar 18 16:46:27 crc kubenswrapper[4696]: I0318 16:46:27.617144 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/pull/0.log" Mar 18 16:46:27 crc kubenswrapper[4696]: I0318 16:46:27.633897 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/pull/0.log" Mar 18 16:46:27 crc kubenswrapper[4696]: I0318 16:46:27.793466 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/extract/0.log" Mar 18 16:46:27 crc kubenswrapper[4696]: I0318 16:46:27.832648 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/util/0.log" Mar 18 16:46:27 crc kubenswrapper[4696]: I0318 16:46:27.833846 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a8895v45_b2fcb79b-86ba-4c55-babe-94a5edc318fd/pull/0.log" Mar 18 16:46:28 crc kubenswrapper[4696]: I0318 16:46:28.236902 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-r4qqr_00c9dc5c-b65b-4f58-8ef8-06ebc3ec0aac/manager/0.log" Mar 18 16:46:28 crc 
kubenswrapper[4696]: I0318 16:46:28.585756 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-tpb84_123177a5-da82-4485-990a-d5ced4dbf8ca/manager/0.log" Mar 18 16:46:29 crc kubenswrapper[4696]: I0318 16:46:29.193066 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-dfclz_1edbcf2a-3a8d-4d2a-8c28-8fe7b66bd7be/manager/0.log" Mar 18 16:46:29 crc kubenswrapper[4696]: I0318 16:46:29.424330 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-kfqqk_78640d4c-766f-4fd8-ab5f-54687b6fb5c6/manager/0.log" Mar 18 16:46:29 crc kubenswrapper[4696]: I0318 16:46:29.846866 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-5nkd5_e915aebf-c140-44ee-90b8-ce169df57fd9/manager/0.log" Mar 18 16:46:30 crc kubenswrapper[4696]: I0318 16:46:30.054132 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5595c7d6ff-gggxc_e5ef6f08-4538-435c-b5c8-42bac561d200/manager/0.log" Mar 18 16:46:30 crc kubenswrapper[4696]: I0318 16:46:30.141419 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-7s46n_789669f2-e26b-4de8-ad21-801820b5806b/manager/0.log" Mar 18 16:46:30 crc kubenswrapper[4696]: I0318 16:46:30.324777 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-76b87776c9-5s8hj_61d78ab1-c6d3-4dfa-a630-5ccddfd04a0f/manager/0.log" Mar 18 16:46:30 crc kubenswrapper[4696]: I0318 16:46:30.421636 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-v85hd_13174b57-caf5-46f2-8605-51e4de880253/manager/0.log" Mar 18 
16:46:30 crc kubenswrapper[4696]: I0318 16:46:30.532341 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-gm92k_411ef48e-d8ac-471f-9018-ee5fd534a4c9/manager/0.log" Mar 18 16:46:30 crc kubenswrapper[4696]: I0318 16:46:30.766785 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-crpf5_e74d1820-3e14-431f-866b-b0ab8b97f20f/manager/0.log" Mar 18 16:46:30 crc kubenswrapper[4696]: I0318 16:46:30.972696 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-blz69_e7082f0a-1b24-4fda-b9b2-eb957c569232/manager/0.log" Mar 18 16:46:31 crc kubenswrapper[4696]: I0318 16:46:31.033235 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-z87fb_afdee753-15ca-42fe-8cc1-937b42d07b85/manager/0.log" Mar 18 16:46:31 crc kubenswrapper[4696]: I0318 16:46:31.229112 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-v47tj_0eacab42-0fe3-4d23-b00c-81353faa98f8/manager/0.log" Mar 18 16:46:31 crc kubenswrapper[4696]: I0318 16:46:31.394842 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7bc867c5bc-c7qv6_2a8d5ad7-4bd8-4fc0-864c-9f8da421ff0e/operator/0.log" Mar 18 16:46:31 crc kubenswrapper[4696]: I0318 16:46:31.670885 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-96vqk_d0469dc6-0b23-4d52-9d1a-ed5f1e2cb83d/registry-server/0.log" Mar 18 16:46:31 crc kubenswrapper[4696]: I0318 16:46:31.836597 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-bn6ct_9597433a-1cf7-4455-8aa6-8709fef284dd/manager/0.log" 
Mar 18 16:46:31 crc kubenswrapper[4696]: I0318 16:46:31.896387 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-sbh54_48afb8e4-ed3d-4c76-9be0-15279dda8889/manager/0.log" Mar 18 16:46:32 crc kubenswrapper[4696]: I0318 16:46:32.093456 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-9rttk_b0eb1bc0-9e8c-4836-b7e4-32be5e48bed4/operator/0.log" Mar 18 16:46:32 crc kubenswrapper[4696]: I0318 16:46:32.284091 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-867f54bc44-wf58k_288ae45d-6c8b-4034-8e4d-e2af975bda6f/manager/0.log" Mar 18 16:46:32 crc kubenswrapper[4696]: I0318 16:46:32.473924 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d84559f47-x7vwf_a08cb64d-e133-4787-956b-4cef003ea78a/manager/0.log" Mar 18 16:46:32 crc kubenswrapper[4696]: I0318 16:46:32.514265 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-p7snf_06cdd947-c4dd-4ccf-bb4b-fffef57443d4/manager/0.log" Mar 18 16:46:32 crc kubenswrapper[4696]: I0318 16:46:32.738341 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-74d6f7b5c-8hndt_fa515a71-3c55-46b7-bab2-60cef0a2b2e1/manager/0.log" Mar 18 16:46:33 crc kubenswrapper[4696]: I0318 16:46:33.070395 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65fbdb4fdd-njrtk_caa2772a-b8a8-4d65-8b8d-19d9c03c62d6/manager/0.log" Mar 18 16:46:37 crc kubenswrapper[4696]: I0318 16:46:37.636694 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-g2jrg_ef87c345-1284-41dd-a5ae-57ae08c9558e/manager/0.log" Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.184476 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.185099 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.185151 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.185877 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83f8bb1de200eeb57fe6463ccfbd7d698a465fd68278d22a98885c91af6e914f"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.185921 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://83f8bb1de200eeb57fe6463ccfbd7d698a465fd68278d22a98885c91af6e914f" gracePeriod=600 Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.755666 4696 
generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="83f8bb1de200eeb57fe6463ccfbd7d698a465fd68278d22a98885c91af6e914f" exitCode=0 Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.755739 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"83f8bb1de200eeb57fe6463ccfbd7d698a465fd68278d22a98885c91af6e914f"} Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.755987 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerStarted","Data":"60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"} Mar 18 16:46:42 crc kubenswrapper[4696]: I0318 16:46:42.756007 4696 scope.go:117] "RemoveContainer" containerID="16de38db2158798a363801b31f48f3fc4701157176a026844e97be1941944942" Mar 18 16:46:52 crc kubenswrapper[4696]: I0318 16:46:52.760052 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gx9sb_b0297eb3-0438-4db1-97bd-405779a01255/control-plane-machine-set-operator/0.log" Mar 18 16:46:52 crc kubenswrapper[4696]: I0318 16:46:52.970339 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6xjgb_0c05edf1-5079-4212-ba5c-19621b2500cf/kube-rbac-proxy/0.log" Mar 18 16:46:52 crc kubenswrapper[4696]: I0318 16:46:52.971648 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6xjgb_0c05edf1-5079-4212-ba5c-19621b2500cf/machine-api-operator/0.log" Mar 18 16:47:06 crc kubenswrapper[4696]: I0318 16:47:06.948011 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-lhsfv_e65694c1-c09d-4ab3-8032-640197b84e20/cert-manager-controller/0.log" Mar 18 16:47:07 crc kubenswrapper[4696]: I0318 16:47:07.129621 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-wmgb2_e4aa5b80-28c0-4b73-91ef-a0b1325d7823/cert-manager-cainjector/0.log" Mar 18 16:47:07 crc kubenswrapper[4696]: I0318 16:47:07.161064 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-7ncrl_ed08594c-854d-4c4d-8390-025916809f21/cert-manager-webhook/0.log" Mar 18 16:47:20 crc kubenswrapper[4696]: I0318 16:47:20.450896 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-pngnv_ea18cdb1-cb1f-46b3-af17-e834b51c6803/nmstate-console-plugin/0.log" Mar 18 16:47:20 crc kubenswrapper[4696]: I0318 16:47:20.685494 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qtwr9_7c0e2cb0-b684-45aa-a7cc-9e9cd0e34704/nmstate-handler/0.log" Mar 18 16:47:20 crc kubenswrapper[4696]: I0318 16:47:20.731740 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-jlrq4_90a85ff7-6f9a-40c4-b528-15f0c3739a2b/kube-rbac-proxy/0.log" Mar 18 16:47:20 crc kubenswrapper[4696]: I0318 16:47:20.787456 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-jlrq4_90a85ff7-6f9a-40c4-b528-15f0c3739a2b/nmstate-metrics/0.log" Mar 18 16:47:20 crc kubenswrapper[4696]: I0318 16:47:20.880395 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-f8s88_3c50fff4-65a2-49c1-997a-658bc72f1fe7/nmstate-operator/0.log" Mar 18 16:47:20 crc kubenswrapper[4696]: I0318 16:47:20.975516 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-pn2r6_d0117f34-5320-46c0-952f-54d4abacdce4/nmstate-webhook/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.054081 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-fnvx2_e1b08d64-c01c-4cb5-b1ee-8cfc03868c70/kube-rbac-proxy/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.140020 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-fnvx2_e1b08d64-c01c-4cb5-b1ee-8cfc03868c70/controller/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.241088 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-frr-files/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.435626 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-reloader/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.487508 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-frr-files/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.495140 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-metrics/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.509880 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-reloader/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.763360 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-reloader/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.770633 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-frr-files/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.775261 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-metrics/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.786095 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-metrics/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.937130 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-reloader/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.942625 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-frr-files/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.972001 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/controller/0.log" Mar 18 16:47:49 crc kubenswrapper[4696]: I0318 16:47:49.998585 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/cp-metrics/0.log" Mar 18 16:47:50 crc kubenswrapper[4696]: I0318 16:47:50.434995 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/kube-rbac-proxy/0.log" Mar 18 16:47:50 crc kubenswrapper[4696]: I0318 16:47:50.458042 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/frr-metrics/0.log" Mar 18 16:47:50 crc kubenswrapper[4696]: I0318 16:47:50.458664 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/kube-rbac-proxy-frr/0.log" Mar 18 16:47:50 crc kubenswrapper[4696]: I0318 16:47:50.682139 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/reloader/0.log" Mar 18 16:47:50 crc kubenswrapper[4696]: I0318 16:47:50.701753 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-whtkt_58a00a45-b349-471d-816b-a05268da02e4/frr-k8s-webhook-server/0.log" Mar 18 16:47:50 crc kubenswrapper[4696]: I0318 16:47:50.983782 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76ff64997f-7v6kl_d823fa6b-b1c9-4c8e-9da9-49e457c2fae6/manager/0.log" Mar 18 16:47:51 crc kubenswrapper[4696]: I0318 16:47:51.114099 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c9479f99b-72fxd_b28709b3-7641-45cc-9e79-9be140d2bcae/webhook-server/0.log" Mar 18 16:47:51 crc kubenswrapper[4696]: I0318 16:47:51.248105 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pr9g2_5556d2f1-1113-45f4-89d4-deea421bb0aa/kube-rbac-proxy/0.log" Mar 18 16:47:51 crc kubenswrapper[4696]: I0318 16:47:51.814855 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pr9g2_5556d2f1-1113-45f4-89d4-deea421bb0aa/speaker/0.log" Mar 18 16:47:52 crc kubenswrapper[4696]: I0318 16:47:52.076574 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r98m9_ccff3265-0675-4907-bcae-b20d0ebddd56/frr/0.log" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.148805 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564208-p42b7"] Mar 18 16:48:00 crc kubenswrapper[4696]: E0318 16:48:00.149814 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ffbaf28-f13f-41a5-b4f5-02312fb9cecd" containerName="oc" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.149830 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffbaf28-f13f-41a5-b4f5-02312fb9cecd" containerName="oc" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.149991 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffbaf28-f13f-41a5-b4f5-02312fb9cecd" containerName="oc" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.150705 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-p42b7" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.152824 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.152947 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.153901 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.163695 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-p42b7"] Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.275447 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnv2r\" (UniqueName: \"kubernetes.io/projected/41402f40-4553-4c68-b0be-b3dd1e748da7-kube-api-access-fnv2r\") pod \"auto-csr-approver-29564208-p42b7\" (UID: \"41402f40-4553-4c68-b0be-b3dd1e748da7\") " pod="openshift-infra/auto-csr-approver-29564208-p42b7" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.376997 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnv2r\" (UniqueName: 
\"kubernetes.io/projected/41402f40-4553-4c68-b0be-b3dd1e748da7-kube-api-access-fnv2r\") pod \"auto-csr-approver-29564208-p42b7\" (UID: \"41402f40-4553-4c68-b0be-b3dd1e748da7\") " pod="openshift-infra/auto-csr-approver-29564208-p42b7" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.394212 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnv2r\" (UniqueName: \"kubernetes.io/projected/41402f40-4553-4c68-b0be-b3dd1e748da7-kube-api-access-fnv2r\") pod \"auto-csr-approver-29564208-p42b7\" (UID: \"41402f40-4553-4c68-b0be-b3dd1e748da7\") " pod="openshift-infra/auto-csr-approver-29564208-p42b7" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.470957 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-p42b7" Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.911355 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-p42b7"] Mar 18 16:48:00 crc kubenswrapper[4696]: I0318 16:48:00.914555 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:48:01 crc kubenswrapper[4696]: I0318 16:48:01.414909 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-p42b7" event={"ID":"41402f40-4553-4c68-b0be-b3dd1e748da7","Type":"ContainerStarted","Data":"fa33d0e7e10841c83cce84ca3752b1bf427b6d5879ec8faddc441511972c2a8f"} Mar 18 16:48:02 crc kubenswrapper[4696]: I0318 16:48:02.425993 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-p42b7" event={"ID":"41402f40-4553-4c68-b0be-b3dd1e748da7","Type":"ContainerStarted","Data":"577f23e625c7e15aca5138d2b97f5e1441e4db6236dd7042cac066abb43cd687"} Mar 18 16:48:02 crc kubenswrapper[4696]: I0318 16:48:02.451601 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29564208-p42b7" podStartSLOduration=1.2386800629999999 podStartE2EDuration="2.451581632s" podCreationTimestamp="2026-03-18 16:48:00 +0000 UTC" firstStartedPulling="2026-03-18 16:48:00.914254532 +0000 UTC m=+4323.920428748" lastFinishedPulling="2026-03-18 16:48:02.127156111 +0000 UTC m=+4325.133330317" observedRunningTime="2026-03-18 16:48:02.44319937 +0000 UTC m=+4325.449373576" watchObservedRunningTime="2026-03-18 16:48:02.451581632 +0000 UTC m=+4325.457755838" Mar 18 16:48:03 crc kubenswrapper[4696]: I0318 16:48:03.435367 4696 generic.go:334] "Generic (PLEG): container finished" podID="41402f40-4553-4c68-b0be-b3dd1e748da7" containerID="577f23e625c7e15aca5138d2b97f5e1441e4db6236dd7042cac066abb43cd687" exitCode=0 Mar 18 16:48:03 crc kubenswrapper[4696]: I0318 16:48:03.435427 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-p42b7" event={"ID":"41402f40-4553-4c68-b0be-b3dd1e748da7","Type":"ContainerDied","Data":"577f23e625c7e15aca5138d2b97f5e1441e4db6236dd7042cac066abb43cd687"} Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.202678 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/util/0.log" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.347615 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/pull/0.log" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.368690 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/util/0.log" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.412477 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/pull/0.log" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.602892 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/util/0.log" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.653280 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/pull/0.log" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.698871 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dlcmh_60e108b1-f55a-4685-8219-7be977826f05/extract/0.log" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.805959 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/util/0.log" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.807994 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-p42b7" Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.961908 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnv2r\" (UniqueName: \"kubernetes.io/projected/41402f40-4553-4c68-b0be-b3dd1e748da7-kube-api-access-fnv2r\") pod \"41402f40-4553-4c68-b0be-b3dd1e748da7\" (UID: \"41402f40-4553-4c68-b0be-b3dd1e748da7\") " Mar 18 16:48:04 crc kubenswrapper[4696]: I0318 16:48:04.967701 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41402f40-4553-4c68-b0be-b3dd1e748da7-kube-api-access-fnv2r" (OuterVolumeSpecName: "kube-api-access-fnv2r") pod "41402f40-4553-4c68-b0be-b3dd1e748da7" (UID: "41402f40-4553-4c68-b0be-b3dd1e748da7"). InnerVolumeSpecName "kube-api-access-fnv2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.028796 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/pull/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.036021 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/pull/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.058088 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/util/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.064309 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnv2r\" (UniqueName: \"kubernetes.io/projected/41402f40-4553-4c68-b0be-b3dd1e748da7-kube-api-access-fnv2r\") on node \"crc\" DevicePath \"\"" Mar 18 16:48:05 crc 
kubenswrapper[4696]: I0318 16:48:05.208858 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/pull/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.210493 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/util/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.251669 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1zsfft_70134d43-b7f8-4a91-864c-9e680a9e8ae9/extract/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.382741 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-utilities/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.454087 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-p42b7" event={"ID":"41402f40-4553-4c68-b0be-b3dd1e748da7","Type":"ContainerDied","Data":"fa33d0e7e10841c83cce84ca3752b1bf427b6d5879ec8faddc441511972c2a8f"} Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.454126 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa33d0e7e10841c83cce84ca3752b1bf427b6d5879ec8faddc441511972c2a8f" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.454142 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-p42b7" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.503909 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-p2zwh"] Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.514308 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-p2zwh"] Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.549937 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-utilities/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.595642 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-content/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.598384 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-content/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.606796 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d625c9e2-dc61-4c51-ad5d-5d38203c3739" path="/var/lib/kubelet/pods/d625c9e2-dc61-4c51-ad5d-5d38203c3739/volumes" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.731019 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-utilities/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.784646 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/extract-content/0.log" Mar 18 16:48:05 crc kubenswrapper[4696]: I0318 16:48:05.978091 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-utilities/0.log" Mar 18 16:48:06 crc kubenswrapper[4696]: I0318 16:48:06.200177 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-utilities/0.log" Mar 18 16:48:06 crc kubenswrapper[4696]: I0318 16:48:06.219872 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-content/0.log" Mar 18 16:48:06 crc kubenswrapper[4696]: I0318 16:48:06.276284 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-content/0.log" Mar 18 16:48:06 crc kubenswrapper[4696]: I0318 16:48:06.416863 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mvwp5_c0f11748-034a-4f55-9da4-ee34ca565a33/registry-server/0.log" Mar 18 16:48:06 crc kubenswrapper[4696]: I0318 16:48:06.462826 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-content/0.log" Mar 18 16:48:06 crc kubenswrapper[4696]: I0318 16:48:06.512363 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/extract-utilities/0.log" Mar 18 16:48:06 crc kubenswrapper[4696]: I0318 16:48:06.695043 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-v84rb_71a9fceb-5471-42f1-867d-28f7196daf81/marketplace-operator/0.log" Mar 18 16:48:06 crc kubenswrapper[4696]: I0318 16:48:06.924454 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-utilities/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.105897 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-content/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.136628 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-utilities/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.189850 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-content/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.251716 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dbfmt_c88d01e2-45ab-4111-9029-3f3e2c12c58e/registry-server/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.366968 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-utilities/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.403967 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/extract-content/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.569980 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g8lp7_a4c082b9-ab53-4ff0-94db-c714a1fc683e/registry-server/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.639915 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-utilities/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.726362 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-content/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.766347 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-content/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.797234 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-utilities/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.898142 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-content/0.log" Mar 18 16:48:07 crc kubenswrapper[4696]: I0318 16:48:07.935676 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/extract-utilities/0.log" Mar 18 16:48:08 crc kubenswrapper[4696]: I0318 16:48:08.140792 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sb4zx_7b73d2de-d188-4d58-84da-b09309189aa1/registry-server/0.log" Mar 18 16:48:26 crc kubenswrapper[4696]: I0318 16:48:26.288735 4696 scope.go:117] "RemoveContainer" containerID="c9f027914cf1a7e9dd9c570f472cec519a4a2481e23f87f1b268461f6106e893" Mar 18 16:48:42 crc kubenswrapper[4696]: I0318 16:48:42.184277 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:48:42 crc kubenswrapper[4696]: I0318 16:48:42.184853 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:49:12 crc kubenswrapper[4696]: I0318 16:49:12.184712 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:49:12 crc kubenswrapper[4696]: I0318 16:49:12.185328 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:49:42 crc kubenswrapper[4696]: I0318 16:49:42.187452 4696 patch_prober.go:28] interesting pod/machine-config-daemon-jjkqr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:49:42 crc kubenswrapper[4696]: I0318 16:49:42.188068 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:49:42 crc kubenswrapper[4696]: I0318 
16:49:42.188122 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" Mar 18 16:49:42 crc kubenswrapper[4696]: I0318 16:49:42.188971 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"} pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:49:42 crc kubenswrapper[4696]: I0318 16:49:42.189031 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerName="machine-config-daemon" containerID="cri-o://60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" gracePeriod=600 Mar 18 16:49:42 crc kubenswrapper[4696]: E0318 16:49:42.340118 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:49:42 crc kubenswrapper[4696]: I0318 16:49:42.342475 4696 generic.go:334] "Generic (PLEG): container finished" podID="d74b6f45-9bfc-4439-b43b-03f441c544fd" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" exitCode=0 Mar 18 16:49:42 crc kubenswrapper[4696]: I0318 16:49:42.342528 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" 
event={"ID":"d74b6f45-9bfc-4439-b43b-03f441c544fd","Type":"ContainerDied","Data":"60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"} Mar 18 16:49:42 crc kubenswrapper[4696]: I0318 16:49:42.342560 4696 scope.go:117] "RemoveContainer" containerID="83f8bb1de200eeb57fe6463ccfbd7d698a465fd68278d22a98885c91af6e914f" Mar 18 16:49:43 crc kubenswrapper[4696]: I0318 16:49:43.354190 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:49:43 crc kubenswrapper[4696]: E0318 16:49:43.354618 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:49:52 crc kubenswrapper[4696]: E0318 16:49:52.000210 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Mar 18 16:49:56 crc kubenswrapper[4696]: I0318 16:49:56.597139 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:49:56 crc kubenswrapper[4696]: E0318 16:49:56.598101 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:49:57 crc kubenswrapper[4696]: I0318 16:49:57.482569 
4696 generic.go:334] "Generic (PLEG): container finished" podID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerID="1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58" exitCode=0 Mar 18 16:49:57 crc kubenswrapper[4696]: I0318 16:49:57.482684 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" event={"ID":"b7b100a7-0df9-496f-bb44-424109bd8c96","Type":"ContainerDied","Data":"1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58"} Mar 18 16:49:57 crc kubenswrapper[4696]: I0318 16:49:57.484136 4696 scope.go:117] "RemoveContainer" containerID="1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58" Mar 18 16:49:58 crc kubenswrapper[4696]: I0318 16:49:58.219610 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2zgrk_must-gather-cm5bk_b7b100a7-0df9-496f-bb44-424109bd8c96/gather/0.log" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.145024 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564210-nn9gd"] Mar 18 16:50:00 crc kubenswrapper[4696]: E0318 16:50:00.145786 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41402f40-4553-4c68-b0be-b3dd1e748da7" containerName="oc" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.145800 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="41402f40-4553-4c68-b0be-b3dd1e748da7" containerName="oc" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.145995 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="41402f40-4553-4c68-b0be-b3dd1e748da7" containerName="oc" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.146646 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-nn9gd" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.148953 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.149192 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.149711 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.168216 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-nn9gd"] Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.301062 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchxj\" (UniqueName: \"kubernetes.io/projected/79c518f5-d6de-45f0-8413-13b5bfe79fdf-kube-api-access-tchxj\") pod \"auto-csr-approver-29564210-nn9gd\" (UID: \"79c518f5-d6de-45f0-8413-13b5bfe79fdf\") " pod="openshift-infra/auto-csr-approver-29564210-nn9gd" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.402933 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tchxj\" (UniqueName: \"kubernetes.io/projected/79c518f5-d6de-45f0-8413-13b5bfe79fdf-kube-api-access-tchxj\") pod \"auto-csr-approver-29564210-nn9gd\" (UID: \"79c518f5-d6de-45f0-8413-13b5bfe79fdf\") " pod="openshift-infra/auto-csr-approver-29564210-nn9gd" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.429120 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchxj\" (UniqueName: \"kubernetes.io/projected/79c518f5-d6de-45f0-8413-13b5bfe79fdf-kube-api-access-tchxj\") pod \"auto-csr-approver-29564210-nn9gd\" (UID: \"79c518f5-d6de-45f0-8413-13b5bfe79fdf\") " 
pod="openshift-infra/auto-csr-approver-29564210-nn9gd" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.469353 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-nn9gd" Mar 18 16:50:00 crc kubenswrapper[4696]: I0318 16:50:00.949313 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-nn9gd"] Mar 18 16:50:00 crc kubenswrapper[4696]: W0318 16:50:00.954657 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c518f5_d6de_45f0_8413_13b5bfe79fdf.slice/crio-ae26b839e18e21f5a49e7a3dc0c6de9731be8e3ea4746c4e8e0b07c559b8ebe5 WatchSource:0}: Error finding container ae26b839e18e21f5a49e7a3dc0c6de9731be8e3ea4746c4e8e0b07c559b8ebe5: Status 404 returned error can't find the container with id ae26b839e18e21f5a49e7a3dc0c6de9731be8e3ea4746c4e8e0b07c559b8ebe5 Mar 18 16:50:01 crc kubenswrapper[4696]: I0318 16:50:01.514494 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-nn9gd" event={"ID":"79c518f5-d6de-45f0-8413-13b5bfe79fdf","Type":"ContainerStarted","Data":"ae26b839e18e21f5a49e7a3dc0c6de9731be8e3ea4746c4e8e0b07c559b8ebe5"} Mar 18 16:50:03 crc kubenswrapper[4696]: I0318 16:50:03.532274 4696 generic.go:334] "Generic (PLEG): container finished" podID="79c518f5-d6de-45f0-8413-13b5bfe79fdf" containerID="d4fd13b0a3e63368341c173b8160fd8f2af9bf2a77002a29561807676663060e" exitCode=0 Mar 18 16:50:03 crc kubenswrapper[4696]: I0318 16:50:03.532380 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-nn9gd" event={"ID":"79c518f5-d6de-45f0-8413-13b5bfe79fdf","Type":"ContainerDied","Data":"d4fd13b0a3e63368341c173b8160fd8f2af9bf2a77002a29561807676663060e"} Mar 18 16:50:05 crc kubenswrapper[4696]: I0318 16:50:05.030293 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-nn9gd" Mar 18 16:50:05 crc kubenswrapper[4696]: I0318 16:50:05.194002 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tchxj\" (UniqueName: \"kubernetes.io/projected/79c518f5-d6de-45f0-8413-13b5bfe79fdf-kube-api-access-tchxj\") pod \"79c518f5-d6de-45f0-8413-13b5bfe79fdf\" (UID: \"79c518f5-d6de-45f0-8413-13b5bfe79fdf\") " Mar 18 16:50:05 crc kubenswrapper[4696]: I0318 16:50:05.201873 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c518f5-d6de-45f0-8413-13b5bfe79fdf-kube-api-access-tchxj" (OuterVolumeSpecName: "kube-api-access-tchxj") pod "79c518f5-d6de-45f0-8413-13b5bfe79fdf" (UID: "79c518f5-d6de-45f0-8413-13b5bfe79fdf"). InnerVolumeSpecName "kube-api-access-tchxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:50:05 crc kubenswrapper[4696]: I0318 16:50:05.296628 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tchxj\" (UniqueName: \"kubernetes.io/projected/79c518f5-d6de-45f0-8413-13b5bfe79fdf-kube-api-access-tchxj\") on node \"crc\" DevicePath \"\"" Mar 18 16:50:05 crc kubenswrapper[4696]: I0318 16:50:05.562227 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-nn9gd" event={"ID":"79c518f5-d6de-45f0-8413-13b5bfe79fdf","Type":"ContainerDied","Data":"ae26b839e18e21f5a49e7a3dc0c6de9731be8e3ea4746c4e8e0b07c559b8ebe5"} Mar 18 16:50:05 crc kubenswrapper[4696]: I0318 16:50:05.562601 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae26b839e18e21f5a49e7a3dc0c6de9731be8e3ea4746c4e8e0b07c559b8ebe5" Mar 18 16:50:05 crc kubenswrapper[4696]: I0318 16:50:05.562675 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-nn9gd" Mar 18 16:50:06 crc kubenswrapper[4696]: I0318 16:50:06.095583 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-6mtpb"] Mar 18 16:50:06 crc kubenswrapper[4696]: I0318 16:50:06.103838 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-6mtpb"] Mar 18 16:50:07 crc kubenswrapper[4696]: I0318 16:50:07.609037 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10880ed9-e38d-45c2-8267-37ef99615c30" path="/var/lib/kubelet/pods/10880ed9-e38d-45c2-8267-37ef99615c30/volumes" Mar 18 16:50:09 crc kubenswrapper[4696]: I0318 16:50:09.523490 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2zgrk/must-gather-cm5bk"] Mar 18 16:50:09 crc kubenswrapper[4696]: I0318 16:50:09.524111 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" podUID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerName="copy" containerID="cri-o://09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb" gracePeriod=2 Mar 18 16:50:09 crc kubenswrapper[4696]: I0318 16:50:09.532394 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2zgrk/must-gather-cm5bk"] Mar 18 16:50:09 crc kubenswrapper[4696]: I0318 16:50:09.954709 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2zgrk_must-gather-cm5bk_b7b100a7-0df9-496f-bb44-424109bd8c96/copy/0.log" Mar 18 16:50:09 crc kubenswrapper[4696]: I0318 16:50:09.955392 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.087350 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7b100a7-0df9-496f-bb44-424109bd8c96-must-gather-output\") pod \"b7b100a7-0df9-496f-bb44-424109bd8c96\" (UID: \"b7b100a7-0df9-496f-bb44-424109bd8c96\") " Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.087459 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzrnl\" (UniqueName: \"kubernetes.io/projected/b7b100a7-0df9-496f-bb44-424109bd8c96-kube-api-access-xzrnl\") pod \"b7b100a7-0df9-496f-bb44-424109bd8c96\" (UID: \"b7b100a7-0df9-496f-bb44-424109bd8c96\") " Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.094595 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b100a7-0df9-496f-bb44-424109bd8c96-kube-api-access-xzrnl" (OuterVolumeSpecName: "kube-api-access-xzrnl") pod "b7b100a7-0df9-496f-bb44-424109bd8c96" (UID: "b7b100a7-0df9-496f-bb44-424109bd8c96"). InnerVolumeSpecName "kube-api-access-xzrnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.190162 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzrnl\" (UniqueName: \"kubernetes.io/projected/b7b100a7-0df9-496f-bb44-424109bd8c96-kube-api-access-xzrnl\") on node \"crc\" DevicePath \"\"" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.262683 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b100a7-0df9-496f-bb44-424109bd8c96-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b7b100a7-0df9-496f-bb44-424109bd8c96" (UID: "b7b100a7-0df9-496f-bb44-424109bd8c96"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.292301 4696 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7b100a7-0df9-496f-bb44-424109bd8c96-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.597110 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:50:10 crc kubenswrapper[4696]: E0318 16:50:10.597548 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.631363 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2zgrk_must-gather-cm5bk_b7b100a7-0df9-496f-bb44-424109bd8c96/copy/0.log" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.631820 4696 generic.go:334] "Generic (PLEG): container finished" podID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerID="09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb" exitCode=143 Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.631882 4696 scope.go:117] "RemoveContainer" containerID="09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.631905 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zgrk/must-gather-cm5bk" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.654123 4696 scope.go:117] "RemoveContainer" containerID="1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.743427 4696 scope.go:117] "RemoveContainer" containerID="09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb" Mar 18 16:50:10 crc kubenswrapper[4696]: E0318 16:50:10.745717 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb\": container with ID starting with 09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb not found: ID does not exist" containerID="09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.745753 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb"} err="failed to get container status \"09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb\": rpc error: code = NotFound desc = could not find container \"09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb\": container with ID starting with 09b8c5b416289d37493dc15489994489abc85f8ed47bcd45654ea4ca6ac08bcb not found: ID does not exist" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.745775 4696 scope.go:117] "RemoveContainer" containerID="1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58" Mar 18 16:50:10 crc kubenswrapper[4696]: E0318 16:50:10.747137 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58\": container with ID starting with 
1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58 not found: ID does not exist" containerID="1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58" Mar 18 16:50:10 crc kubenswrapper[4696]: I0318 16:50:10.747175 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58"} err="failed to get container status \"1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58\": rpc error: code = NotFound desc = could not find container \"1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58\": container with ID starting with 1039eb63d272641f85c38d2bea4edc2806860f49852d47de8c5b8a40ca786a58 not found: ID does not exist" Mar 18 16:50:11 crc kubenswrapper[4696]: I0318 16:50:11.606935 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b100a7-0df9-496f-bb44-424109bd8c96" path="/var/lib/kubelet/pods/b7b100a7-0df9-496f-bb44-424109bd8c96/volumes" Mar 18 16:50:25 crc kubenswrapper[4696]: I0318 16:50:25.597752 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:50:25 crc kubenswrapper[4696]: E0318 16:50:25.599343 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:50:26 crc kubenswrapper[4696]: I0318 16:50:26.397617 4696 scope.go:117] "RemoveContainer" containerID="447387eae2a43567fe2682c878ece69baa63c120f2c9628c05c8d6bbc092462d" Mar 18 16:50:38 crc kubenswrapper[4696]: I0318 16:50:38.598382 4696 scope.go:117] "RemoveContainer" 
containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:50:38 crc kubenswrapper[4696]: E0318 16:50:38.599820 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:50:52 crc kubenswrapper[4696]: I0318 16:50:52.597198 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:50:52 crc kubenswrapper[4696]: E0318 16:50:52.598037 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:51:03 crc kubenswrapper[4696]: I0318 16:51:03.598046 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:51:03 crc kubenswrapper[4696]: E0318 16:51:03.603146 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:51:15 crc kubenswrapper[4696]: I0318 16:51:15.599413 4696 scope.go:117] 
"RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:51:15 crc kubenswrapper[4696]: E0318 16:51:15.600854 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:51:26 crc kubenswrapper[4696]: I0318 16:51:26.477992 4696 scope.go:117] "RemoveContainer" containerID="50da90607e61a5fee728f97c7be6fa03f2a20c73c494a2ceed6b3e7df2338e54" Mar 18 16:51:26 crc kubenswrapper[4696]: I0318 16:51:26.571992 4696 scope.go:117] "RemoveContainer" containerID="80b085e568e52d78e894b0950373c1ee953eecc1005f7171d857e3099c80ddcf" Mar 18 16:51:30 crc kubenswrapper[4696]: I0318 16:51:30.597191 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:51:30 crc kubenswrapper[4696]: E0318 16:51:30.598891 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:51:43 crc kubenswrapper[4696]: I0318 16:51:43.597580 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:51:43 crc kubenswrapper[4696]: E0318 16:51:43.598350 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:51:58 crc kubenswrapper[4696]: I0318 16:51:58.597967 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:51:58 crc kubenswrapper[4696]: E0318 16:51:58.599937 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.153091 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564212-h5fvt"] Mar 18 16:52:00 crc kubenswrapper[4696]: E0318 16:52:00.154814 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerName="gather" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.154901 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerName="gather" Mar 18 16:52:00 crc kubenswrapper[4696]: E0318 16:52:00.155007 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerName="copy" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.155079 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerName="copy" Mar 18 16:52:00 crc kubenswrapper[4696]: E0318 16:52:00.155194 4696 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="79c518f5-d6de-45f0-8413-13b5bfe79fdf" containerName="oc" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.155291 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c518f5-d6de-45f0-8413-13b5bfe79fdf" containerName="oc" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.155624 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerName="copy" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.155738 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c518f5-d6de-45f0-8413-13b5bfe79fdf" containerName="oc" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.155819 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b100a7-0df9-496f-bb44-424109bd8c96" containerName="gather" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.156587 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-h5fvt" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.159539 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.159867 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.162975 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nzrnx" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.165665 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-h5fvt"] Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.230726 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7wv2\" (UniqueName: 
\"kubernetes.io/projected/3396bbcb-7e48-47b8-b5a6-972d1d4df3af-kube-api-access-k7wv2\") pod \"auto-csr-approver-29564212-h5fvt\" (UID: \"3396bbcb-7e48-47b8-b5a6-972d1d4df3af\") " pod="openshift-infra/auto-csr-approver-29564212-h5fvt" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.332254 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7wv2\" (UniqueName: \"kubernetes.io/projected/3396bbcb-7e48-47b8-b5a6-972d1d4df3af-kube-api-access-k7wv2\") pod \"auto-csr-approver-29564212-h5fvt\" (UID: \"3396bbcb-7e48-47b8-b5a6-972d1d4df3af\") " pod="openshift-infra/auto-csr-approver-29564212-h5fvt" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.351674 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7wv2\" (UniqueName: \"kubernetes.io/projected/3396bbcb-7e48-47b8-b5a6-972d1d4df3af-kube-api-access-k7wv2\") pod \"auto-csr-approver-29564212-h5fvt\" (UID: \"3396bbcb-7e48-47b8-b5a6-972d1d4df3af\") " pod="openshift-infra/auto-csr-approver-29564212-h5fvt" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.479017 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-h5fvt" Mar 18 16:52:00 crc kubenswrapper[4696]: I0318 16:52:00.966359 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-h5fvt"] Mar 18 16:52:01 crc kubenswrapper[4696]: I0318 16:52:01.321045 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-h5fvt" event={"ID":"3396bbcb-7e48-47b8-b5a6-972d1d4df3af","Type":"ContainerStarted","Data":"c9a1fcd0531aab4cc10880a65597fd825a9931da5ff5a117409b58334a59a853"} Mar 18 16:52:03 crc kubenswrapper[4696]: I0318 16:52:03.349813 4696 generic.go:334] "Generic (PLEG): container finished" podID="3396bbcb-7e48-47b8-b5a6-972d1d4df3af" containerID="4e084d0d555a3e9bb83377d5c12c12cbf3a07346f354f54a5bad578578e41650" exitCode=0 Mar 18 16:52:03 crc kubenswrapper[4696]: I0318 16:52:03.349935 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-h5fvt" event={"ID":"3396bbcb-7e48-47b8-b5a6-972d1d4df3af","Type":"ContainerDied","Data":"4e084d0d555a3e9bb83377d5c12c12cbf3a07346f354f54a5bad578578e41650"} Mar 18 16:52:04 crc kubenswrapper[4696]: I0318 16:52:04.788435 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-h5fvt" Mar 18 16:52:04 crc kubenswrapper[4696]: I0318 16:52:04.817068 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7wv2\" (UniqueName: \"kubernetes.io/projected/3396bbcb-7e48-47b8-b5a6-972d1d4df3af-kube-api-access-k7wv2\") pod \"3396bbcb-7e48-47b8-b5a6-972d1d4df3af\" (UID: \"3396bbcb-7e48-47b8-b5a6-972d1d4df3af\") " Mar 18 16:52:04 crc kubenswrapper[4696]: I0318 16:52:04.822769 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3396bbcb-7e48-47b8-b5a6-972d1d4df3af-kube-api-access-k7wv2" (OuterVolumeSpecName: "kube-api-access-k7wv2") pod "3396bbcb-7e48-47b8-b5a6-972d1d4df3af" (UID: "3396bbcb-7e48-47b8-b5a6-972d1d4df3af"). InnerVolumeSpecName "kube-api-access-k7wv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:52:04 crc kubenswrapper[4696]: I0318 16:52:04.918066 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7wv2\" (UniqueName: \"kubernetes.io/projected/3396bbcb-7e48-47b8-b5a6-972d1d4df3af-kube-api-access-k7wv2\") on node \"crc\" DevicePath \"\"" Mar 18 16:52:05 crc kubenswrapper[4696]: I0318 16:52:05.368317 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-h5fvt" event={"ID":"3396bbcb-7e48-47b8-b5a6-972d1d4df3af","Type":"ContainerDied","Data":"c9a1fcd0531aab4cc10880a65597fd825a9931da5ff5a117409b58334a59a853"} Mar 18 16:52:05 crc kubenswrapper[4696]: I0318 16:52:05.368379 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-h5fvt" Mar 18 16:52:05 crc kubenswrapper[4696]: I0318 16:52:05.368386 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9a1fcd0531aab4cc10880a65597fd825a9931da5ff5a117409b58334a59a853" Mar 18 16:52:05 crc kubenswrapper[4696]: I0318 16:52:05.852180 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-27ttw"] Mar 18 16:52:05 crc kubenswrapper[4696]: I0318 16:52:05.860538 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-27ttw"] Mar 18 16:52:07 crc kubenswrapper[4696]: I0318 16:52:07.609071 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffbaf28-f13f-41a5-b4f5-02312fb9cecd" path="/var/lib/kubelet/pods/9ffbaf28-f13f-41a5-b4f5-02312fb9cecd/volumes" Mar 18 16:52:11 crc kubenswrapper[4696]: I0318 16:52:11.597917 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:52:11 crc kubenswrapper[4696]: E0318 16:52:11.598472 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.599823 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bqrsn"] Mar 18 16:52:12 crc kubenswrapper[4696]: E0318 16:52:12.601559 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3396bbcb-7e48-47b8-b5a6-972d1d4df3af" containerName="oc" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.601671 4696 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3396bbcb-7e48-47b8-b5a6-972d1d4df3af" containerName="oc" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.601985 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3396bbcb-7e48-47b8-b5a6-972d1d4df3af" containerName="oc" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.604119 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.609866 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqrsn"] Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.766720 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-utilities\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.767296 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-catalog-content\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.767415 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkzwh\" (UniqueName: \"kubernetes.io/projected/d79542c5-0fc0-4901-a663-d63d5cba5fb6-kube-api-access-hkzwh\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.868847 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-utilities\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.868896 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-catalog-content\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.868960 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkzwh\" (UniqueName: \"kubernetes.io/projected/d79542c5-0fc0-4901-a663-d63d5cba5fb6-kube-api-access-hkzwh\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.869433 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-utilities\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.869539 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-catalog-content\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.887216 4696 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hkzwh\" (UniqueName: \"kubernetes.io/projected/d79542c5-0fc0-4901-a663-d63d5cba5fb6-kube-api-access-hkzwh\") pod \"redhat-marketplace-bqrsn\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:12 crc kubenswrapper[4696]: I0318 16:52:12.936738 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:13 crc kubenswrapper[4696]: I0318 16:52:13.385295 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqrsn"] Mar 18 16:52:13 crc kubenswrapper[4696]: I0318 16:52:13.459051 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqrsn" event={"ID":"d79542c5-0fc0-4901-a663-d63d5cba5fb6","Type":"ContainerStarted","Data":"659e1ae879f4d25fca51d7e3d06d39eec22fd09d917eeaa00b5cc590443dab79"} Mar 18 16:52:14 crc kubenswrapper[4696]: I0318 16:52:14.471571 4696 generic.go:334] "Generic (PLEG): container finished" podID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerID="7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727" exitCode=0 Mar 18 16:52:14 crc kubenswrapper[4696]: I0318 16:52:14.471617 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqrsn" event={"ID":"d79542c5-0fc0-4901-a663-d63d5cba5fb6","Type":"ContainerDied","Data":"7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727"} Mar 18 16:52:16 crc kubenswrapper[4696]: I0318 16:52:16.489431 4696 generic.go:334] "Generic (PLEG): container finished" podID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerID="2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63" exitCode=0 Mar 18 16:52:16 crc kubenswrapper[4696]: I0318 16:52:16.489702 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqrsn" 
event={"ID":"d79542c5-0fc0-4901-a663-d63d5cba5fb6","Type":"ContainerDied","Data":"2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63"} Mar 18 16:52:17 crc kubenswrapper[4696]: I0318 16:52:17.503838 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqrsn" event={"ID":"d79542c5-0fc0-4901-a663-d63d5cba5fb6","Type":"ContainerStarted","Data":"ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77"} Mar 18 16:52:17 crc kubenswrapper[4696]: I0318 16:52:17.533902 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bqrsn" podStartSLOduration=3.089035787 podStartE2EDuration="5.533880284s" podCreationTimestamp="2026-03-18 16:52:12 +0000 UTC" firstStartedPulling="2026-03-18 16:52:14.473708781 +0000 UTC m=+4577.479882987" lastFinishedPulling="2026-03-18 16:52:16.918553278 +0000 UTC m=+4579.924727484" observedRunningTime="2026-03-18 16:52:17.521395349 +0000 UTC m=+4580.527569575" watchObservedRunningTime="2026-03-18 16:52:17.533880284 +0000 UTC m=+4580.540054490" Mar 18 16:52:22 crc kubenswrapper[4696]: I0318 16:52:22.937727 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:22 crc kubenswrapper[4696]: I0318 16:52:22.938584 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:22 crc kubenswrapper[4696]: I0318 16:52:22.995449 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:23 crc kubenswrapper[4696]: I0318 16:52:23.629145 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:23 crc kubenswrapper[4696]: I0318 16:52:23.677339 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bqrsn"] Mar 18 16:52:24 crc kubenswrapper[4696]: I0318 16:52:24.598406 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc" Mar 18 16:52:24 crc kubenswrapper[4696]: E0318 16:52:24.598801 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd" Mar 18 16:52:25 crc kubenswrapper[4696]: I0318 16:52:25.602677 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bqrsn" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerName="registry-server" containerID="cri-o://ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77" gracePeriod=2 Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.076932 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqrsn" Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.236923 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-utilities\") pod \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.237072 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkzwh\" (UniqueName: \"kubernetes.io/projected/d79542c5-0fc0-4901-a663-d63d5cba5fb6-kube-api-access-hkzwh\") pod \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.237156 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-catalog-content\") pod \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\" (UID: \"d79542c5-0fc0-4901-a663-d63d5cba5fb6\") " Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.238549 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-utilities" (OuterVolumeSpecName: "utilities") pod "d79542c5-0fc0-4901-a663-d63d5cba5fb6" (UID: "d79542c5-0fc0-4901-a663-d63d5cba5fb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.243426 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d79542c5-0fc0-4901-a663-d63d5cba5fb6-kube-api-access-hkzwh" (OuterVolumeSpecName: "kube-api-access-hkzwh") pod "d79542c5-0fc0-4901-a663-d63d5cba5fb6" (UID: "d79542c5-0fc0-4901-a663-d63d5cba5fb6"). InnerVolumeSpecName "kube-api-access-hkzwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.268570 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d79542c5-0fc0-4901-a663-d63d5cba5fb6" (UID: "d79542c5-0fc0-4901-a663-d63d5cba5fb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.339218 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkzwh\" (UniqueName: \"kubernetes.io/projected/d79542c5-0fc0-4901-a663-d63d5cba5fb6-kube-api-access-hkzwh\") on node \"crc\" DevicePath \"\"" Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.339271 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.339282 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79542c5-0fc0-4901-a663-d63d5cba5fb6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.632379 4696 generic.go:334] "Generic (PLEG): container finished" podID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerID="ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77" exitCode=0 Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.632498 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqrsn"
Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.632534 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqrsn" event={"ID":"d79542c5-0fc0-4901-a663-d63d5cba5fb6","Type":"ContainerDied","Data":"ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77"}
Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.632836 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqrsn" event={"ID":"d79542c5-0fc0-4901-a663-d63d5cba5fb6","Type":"ContainerDied","Data":"659e1ae879f4d25fca51d7e3d06d39eec22fd09d917eeaa00b5cc590443dab79"}
Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.632863 4696 scope.go:117] "RemoveContainer" containerID="ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77"
Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.653736 4696 scope.go:117] "RemoveContainer" containerID="2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63"
Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.676824 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqrsn"]
Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.685635 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqrsn"]
Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.760067 4696 scope.go:117] "RemoveContainer" containerID="049a8c71a695690db0006119087c047dba792e9bb0c17029c7fdafffc315b467"
Mar 18 16:52:26 crc kubenswrapper[4696]: I0318 16:52:26.962387 4696 scope.go:117] "RemoveContainer" containerID="7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727"
Mar 18 16:52:27 crc kubenswrapper[4696]: I0318 16:52:27.056451 4696 scope.go:117] "RemoveContainer" containerID="ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77"
Mar 18 16:52:27 crc kubenswrapper[4696]: E0318 16:52:27.056987 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77\": container with ID starting with ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77 not found: ID does not exist" containerID="ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77"
Mar 18 16:52:27 crc kubenswrapper[4696]: I0318 16:52:27.057039 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77"} err="failed to get container status \"ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77\": rpc error: code = NotFound desc = could not find container \"ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77\": container with ID starting with ef2b10293cdc017b38bd38f42c4b4e1cd85659fb2ba48c4850d541b10d806d77 not found: ID does not exist"
Mar 18 16:52:27 crc kubenswrapper[4696]: I0318 16:52:27.057073 4696 scope.go:117] "RemoveContainer" containerID="2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63"
Mar 18 16:52:27 crc kubenswrapper[4696]: E0318 16:52:27.057419 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63\": container with ID starting with 2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63 not found: ID does not exist" containerID="2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63"
Mar 18 16:52:27 crc kubenswrapper[4696]: I0318 16:52:27.057467 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63"} err="failed to get container status \"2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63\": rpc error: code = NotFound desc = could not find container \"2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63\": container with ID starting with 2443a90f221d2ba1876e0db589685ffe4a015692260a5df27339e6100fca5a63 not found: ID does not exist"
Mar 18 16:52:27 crc kubenswrapper[4696]: I0318 16:52:27.057496 4696 scope.go:117] "RemoveContainer" containerID="7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727"
Mar 18 16:52:27 crc kubenswrapper[4696]: E0318 16:52:27.057765 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727\": container with ID starting with 7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727 not found: ID does not exist" containerID="7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727"
Mar 18 16:52:27 crc kubenswrapper[4696]: I0318 16:52:27.057801 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727"} err="failed to get container status \"7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727\": rpc error: code = NotFound desc = could not find container \"7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727\": container with ID starting with 7baa19e0d48004d4084eca975de86bb9d8a1586fb59a841bf2a37a27ba273727 not found: ID does not exist"
Mar 18 16:52:27 crc kubenswrapper[4696]: I0318 16:52:27.612177 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" path="/var/lib/kubelet/pods/d79542c5-0fc0-4901-a663-d63d5cba5fb6/volumes"
Mar 18 16:52:39 crc kubenswrapper[4696]: I0318 16:52:39.597657 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"
Mar 18 16:52:39 crc kubenswrapper[4696]: E0318 16:52:39.598290 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.019136 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbkj8"]
Mar 18 16:52:45 crc kubenswrapper[4696]: E0318 16:52:45.020049 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerName="extract-content"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.020066 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerName="extract-content"
Mar 18 16:52:45 crc kubenswrapper[4696]: E0318 16:52:45.020094 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerName="registry-server"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.020103 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerName="registry-server"
Mar 18 16:52:45 crc kubenswrapper[4696]: E0318 16:52:45.020133 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerName="extract-utilities"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.020140 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerName="extract-utilities"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.020297 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79542c5-0fc0-4901-a663-d63d5cba5fb6" containerName="registry-server"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.022001 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.044757 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbkj8"]
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.110806 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-utilities\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.111144 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlt4\" (UniqueName: \"kubernetes.io/projected/b732687b-4e62-4d22-b781-8f8de778cfc0-kube-api-access-nmlt4\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.111270 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-catalog-content\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.213411 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-catalog-content\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.213486 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-utilities\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.213514 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmlt4\" (UniqueName: \"kubernetes.io/projected/b732687b-4e62-4d22-b781-8f8de778cfc0-kube-api-access-nmlt4\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.213965 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-catalog-content\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.214192 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-utilities\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.234049 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmlt4\" (UniqueName: \"kubernetes.io/projected/b732687b-4e62-4d22-b781-8f8de778cfc0-kube-api-access-nmlt4\") pod \"redhat-operators-fbkj8\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") " pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.351007 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:45 crc kubenswrapper[4696]: I0318 16:52:45.834292 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbkj8"]
Mar 18 16:52:46 crc kubenswrapper[4696]: I0318 16:52:46.817447 4696 generic.go:334] "Generic (PLEG): container finished" podID="b732687b-4e62-4d22-b781-8f8de778cfc0" containerID="da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64" exitCode=0
Mar 18 16:52:46 crc kubenswrapper[4696]: I0318 16:52:46.817495 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbkj8" event={"ID":"b732687b-4e62-4d22-b781-8f8de778cfc0","Type":"ContainerDied","Data":"da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64"}
Mar 18 16:52:46 crc kubenswrapper[4696]: I0318 16:52:46.818022 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbkj8" event={"ID":"b732687b-4e62-4d22-b781-8f8de778cfc0","Type":"ContainerStarted","Data":"e7be8c85dd7214f690cda917f7e2a348c19d548c74daf97eebbd11b52bdae44f"}
Mar 18 16:52:48 crc kubenswrapper[4696]: I0318 16:52:48.844959 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbkj8" event={"ID":"b732687b-4e62-4d22-b781-8f8de778cfc0","Type":"ContainerStarted","Data":"8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69"}
Mar 18 16:52:52 crc kubenswrapper[4696]: I0318 16:52:52.597379 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"
Mar 18 16:52:52 crc kubenswrapper[4696]: E0318 16:52:52.599221 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:52:52 crc kubenswrapper[4696]: I0318 16:52:52.885111 4696 generic.go:334] "Generic (PLEG): container finished" podID="b732687b-4e62-4d22-b781-8f8de778cfc0" containerID="8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69" exitCode=0
Mar 18 16:52:52 crc kubenswrapper[4696]: I0318 16:52:52.885167 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbkj8" event={"ID":"b732687b-4e62-4d22-b781-8f8de778cfc0","Type":"ContainerDied","Data":"8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69"}
Mar 18 16:52:53 crc kubenswrapper[4696]: I0318 16:52:53.895745 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbkj8" event={"ID":"b732687b-4e62-4d22-b781-8f8de778cfc0","Type":"ContainerStarted","Data":"0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2"}
Mar 18 16:52:53 crc kubenswrapper[4696]: I0318 16:52:53.920797 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbkj8" podStartSLOduration=3.256713826 podStartE2EDuration="9.920779711s" podCreationTimestamp="2026-03-18 16:52:44 +0000 UTC" firstStartedPulling="2026-03-18 16:52:46.820053148 +0000 UTC m=+4609.826227344" lastFinishedPulling="2026-03-18 16:52:53.484119023 +0000 UTC m=+4616.490293229" observedRunningTime="2026-03-18 16:52:53.916023141 +0000 UTC m=+4616.922197347" watchObservedRunningTime="2026-03-18 16:52:53.920779711 +0000 UTC m=+4616.926953917"
Mar 18 16:52:55 crc kubenswrapper[4696]: I0318 16:52:55.351710 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:55 crc kubenswrapper[4696]: I0318 16:52:55.352275 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:52:56 crc kubenswrapper[4696]: I0318 16:52:56.406888 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fbkj8" podUID="b732687b-4e62-4d22-b781-8f8de778cfc0" containerName="registry-server" probeResult="failure" output=<
Mar 18 16:52:56 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s
Mar 18 16:52:56 crc kubenswrapper[4696]: >
Mar 18 16:53:05 crc kubenswrapper[4696]: I0318 16:53:05.396105 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:53:05 crc kubenswrapper[4696]: I0318 16:53:05.446471 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:53:05 crc kubenswrapper[4696]: I0318 16:53:05.597758 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"
Mar 18 16:53:05 crc kubenswrapper[4696]: E0318 16:53:05.598070 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:53:05 crc kubenswrapper[4696]: I0318 16:53:05.639843 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbkj8"]
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.011126 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbkj8" podUID="b732687b-4e62-4d22-b781-8f8de778cfc0" containerName="registry-server" containerID="cri-o://0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2" gracePeriod=2
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.454112 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.645385 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-utilities\") pod \"b732687b-4e62-4d22-b781-8f8de778cfc0\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") "
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.645587 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmlt4\" (UniqueName: \"kubernetes.io/projected/b732687b-4e62-4d22-b781-8f8de778cfc0-kube-api-access-nmlt4\") pod \"b732687b-4e62-4d22-b781-8f8de778cfc0\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") "
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.645932 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-catalog-content\") pod \"b732687b-4e62-4d22-b781-8f8de778cfc0\" (UID: \"b732687b-4e62-4d22-b781-8f8de778cfc0\") "
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.646287 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-utilities" (OuterVolumeSpecName: "utilities") pod "b732687b-4e62-4d22-b781-8f8de778cfc0" (UID: "b732687b-4e62-4d22-b781-8f8de778cfc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.647949 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.653391 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b732687b-4e62-4d22-b781-8f8de778cfc0-kube-api-access-nmlt4" (OuterVolumeSpecName: "kube-api-access-nmlt4") pod "b732687b-4e62-4d22-b781-8f8de778cfc0" (UID: "b732687b-4e62-4d22-b781-8f8de778cfc0"). InnerVolumeSpecName "kube-api-access-nmlt4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.749415 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmlt4\" (UniqueName: \"kubernetes.io/projected/b732687b-4e62-4d22-b781-8f8de778cfc0-kube-api-access-nmlt4\") on node \"crc\" DevicePath \"\""
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.800965 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b732687b-4e62-4d22-b781-8f8de778cfc0" (UID: "b732687b-4e62-4d22-b781-8f8de778cfc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:53:07 crc kubenswrapper[4696]: I0318 16:53:07.850388 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732687b-4e62-4d22-b781-8f8de778cfc0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.023499 4696 generic.go:334] "Generic (PLEG): container finished" podID="b732687b-4e62-4d22-b781-8f8de778cfc0" containerID="0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2" exitCode=0
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.023570 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbkj8" event={"ID":"b732687b-4e62-4d22-b781-8f8de778cfc0","Type":"ContainerDied","Data":"0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2"}
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.023625 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbkj8" event={"ID":"b732687b-4e62-4d22-b781-8f8de778cfc0","Type":"ContainerDied","Data":"e7be8c85dd7214f690cda917f7e2a348c19d548c74daf97eebbd11b52bdae44f"}
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.023648 4696 scope.go:117] "RemoveContainer" containerID="0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.023646 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbkj8"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.052902 4696 scope.go:117] "RemoveContainer" containerID="8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.089764 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbkj8"]
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.094879 4696 scope.go:117] "RemoveContainer" containerID="da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.098267 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbkj8"]
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.153699 4696 scope.go:117] "RemoveContainer" containerID="0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2"
Mar 18 16:53:08 crc kubenswrapper[4696]: E0318 16:53:08.173046 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2\": container with ID starting with 0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2 not found: ID does not exist" containerID="0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.173091 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2"} err="failed to get container status \"0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2\": rpc error: code = NotFound desc = could not find container \"0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2\": container with ID starting with 0d94b84d1f2844ff72ebb856c9d741e8f912faa67fc0333755453f5fb116c6c2 not found: ID does not exist"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.173121 4696 scope.go:117] "RemoveContainer" containerID="8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69"
Mar 18 16:53:08 crc kubenswrapper[4696]: E0318 16:53:08.176625 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69\": container with ID starting with 8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69 not found: ID does not exist" containerID="8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.176675 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69"} err="failed to get container status \"8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69\": rpc error: code = NotFound desc = could not find container \"8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69\": container with ID starting with 8c5254546d363dff104ef20c7ba49e474db04a63ef5dc3135a0314b4631c6c69 not found: ID does not exist"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.176699 4696 scope.go:117] "RemoveContainer" containerID="da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64"
Mar 18 16:53:08 crc kubenswrapper[4696]: E0318 16:53:08.182109 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64\": container with ID starting with da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64 not found: ID does not exist" containerID="da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64"
Mar 18 16:53:08 crc kubenswrapper[4696]: I0318 16:53:08.182158 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64"} err="failed to get container status \"da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64\": rpc error: code = NotFound desc = could not find container \"da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64\": container with ID starting with da909c2aa3cde3975accf131ddb5051618b57e6b0f11663b1933fcd2708a3a64 not found: ID does not exist"
Mar 18 16:53:09 crc kubenswrapper[4696]: I0318 16:53:09.608781 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b732687b-4e62-4d22-b781-8f8de778cfc0" path="/var/lib/kubelet/pods/b732687b-4e62-4d22-b781-8f8de778cfc0/volumes"
Mar 18 16:53:18 crc kubenswrapper[4696]: I0318 16:53:18.605144 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"
Mar 18 16:53:18 crc kubenswrapper[4696]: E0318 16:53:18.606182 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:53:33 crc kubenswrapper[4696]: I0318 16:53:33.598099 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"
Mar 18 16:53:33 crc kubenswrapper[4696]: E0318 16:53:33.598875 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"
Mar 18 16:53:45 crc kubenswrapper[4696]: I0318 16:53:45.597983 4696 scope.go:117] "RemoveContainer" containerID="60f8abfccc6b4b7bf69c4148a2fd17d059c88d1fd099f3acbb170d79acafc5dc"
Mar 18 16:53:45 crc kubenswrapper[4696]: E0318 16:53:45.598793 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jjkqr_openshift-machine-config-operator(d74b6f45-9bfc-4439-b43b-03f441c544fd)\"" pod="openshift-machine-config-operator/machine-config-daemon-jjkqr" podUID="d74b6f45-9bfc-4439-b43b-03f441c544fd"